Seeing at Billionths and Billionths

This was my (unsuccessful) entry into last year’s Wellcome Trust Science Writing Prize. It is very similar to the post I published on the 24th of June, taking a slightly different slant on the same theme.

The Greek word skopein underlies the etymology of a set of instruments that laid the foundations for our modern understanding of nature: microscopes. References to the word are recognizable across language barriers thanks to the pervasive influence of the ancient languages of scholarship, and common usage gives us hints as to the meaning. We scope out a new situation, implying that we not only give a cursory glance but also take some measure or judgement.

Drops of glass held in brass enabled Robert Hooke and Antonie van Leeuwenhoek to make observations that gave rise to germ theory. Light microscopy unveiled our friends and foes in bacteria, replacing humours and miasmas as the primary effectors driving human health and disease. The concept of miasmatic disease, the notion that illness is caused by tainted air, is now so far-fetched that the term has been almost entirely lost to time. The bird-like masks worn by plague doctors were stuffed with potpourri: the thinking of the time was that fragrance alone could protect against the miasma of the Black Death. The idea seems silly to us now, thanks to the fruits of our inquiry. The cells described by Hooke and the “animalcules” seen by Leeuwenhoek marked a transition from a world operated by invisible forces to one in which the mechanisms of nature were vulnerable to human scrutiny. In short, science was born in the backs of our eyes.

The ability of an observer using an optical instrument to differentiate between two objects has, until recently, been limited by the tendency of waves to bend at boundaries, a phenomenon known as diffraction; the same effect can be seen in water ripples bending around a pier. The limiting effects of diffraction on imaging were formalised by German physicist Ernst Abbe in 1873.
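
As a rough illustration (my own sketch, not from the original essay), Abbe’s limit says the smallest resolvable separation is about the wavelength divided by twice the numerical aperture:

    # Abbe diffraction limit: d = wavelength / (2 * NA)
    wavelength_nm = 500        # green light
    numerical_aperture = 1.4   # a high-end oil-immersion objective

    d_nm = wavelength_nm / (2 * numerical_aperture)
    print(f"diffraction-limited resolution: {d_nm:.0f} nm")  # ~179 nm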

If the precision of optical components is tight enough to eliminate aberrations, and seeing conditions are good enough, imaging is “diffraction-limited.” With the advent of adaptive optics, dynamic mirrors and the like let observers remove aberrations from the sample as well as the optics. Originally developed to spy on dim satellites through a turbulent atmosphere, adaptive optics have recently been applied to microscopy to counteract the blurring effect of looking through tissue. If astronomy is like looking out from underwater, microscopy is like peering at the leaves at the bottom of a hot cuppa, complete with milk.

Even with the precise control afforded by adaptive optics, the best possible resolution is still governed by the diffraction limit, about half the wavelength of light. Just as the microbial world was invisible before Leeuwenhoek’s time, through most of the 20th century the molecular machinery underpinning the cellular processes of life remained invisible, smeared by the diffraction limit into an irresolvable blur.

A human cell is typically on the order of ten microns in diameter. Its proteins, membranes, and DNA are organised at a level a hundredth to a thousandth as large, in the tens and hundreds of nanometres. In a conventional microscope, information at this scale is lost to diffraction, but it underlies all of life. Many of the mechanisms of disease operate at this level as well, and knowledge about how and why cells make mistakes has resounding implications for cancer and aging. In the past few decades physicists and microscopists have developed a number of techniques to go beyond the diffraction limit and measure the nanometre-scale machinery that makes life.

A number of techniques have been developed to surpass the diffraction barrier. They vary widely, relying on some combination of engineered illumination and engineered fluorescent proteins. What they share is computation: the computer has become as important an optical component as a proper lens.
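
To give a flavour of that computational role, here is a minimal frequency-domain (Wiener-style) deconvolution sketch in Python; the regularisation constant and the assumption of a known point-spread function are my own illustrative choices, not anything from the post:

    import numpy as np

    def wiener_deconvolve(image, psf, eps=1e-3):
        # Sharpen a blurred image given its point-spread function (PSF) by
        # dividing out the blur in the frequency domain; eps regularises
        # frequencies where the PSF carries almost no signal.
        H = np.fft.fft2(psf, s=image.shape)  # transfer function of the blur
        F = np.fft.fft2(image)
        estimate = F * np.conj(H) / (np.abs(H) ** 2 + eps)
        return np.real(np.fft.ifft2(estimate))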

New instrumentation enables new measurements in the service of human inquiry. Questions about biology at increasingly small spatial scales and in increasingly challenging imaging contexts generate the need for higher-precision techniques, in turn opening a floodgate of previously unobtainable data. New data lead to new questions, and the cycle continues until it abuts the fundamental laws of physical nature. Before bacteria were discovered, it was impossible to imagine their role in illness and equally impossible to test it. Once the role was known, it became a simple intuitive leap for Alexander Fleming to suppose that the growth inhibition of bacteria by fungi he saw in the lab might be useful as medicine. With the ability to see at the level of tens of nanometres, another world of invisible forces has been opened to human consideration and innovation. Scientists have already leapt one barrier at the diffraction limit. With no fundamental limit to human curiosity, let us consider current super-resolution techniques as the first of many triumphs in detection beyond the limits of what is deemed possible.

Rubbish in, Garbage Out?

Extraordinary claims require extraordinary press releases?

You have probably read a headline in the past few weeks stating that NASA has verified that an infamous, seemingly reactionless propulsion drive does in fact produce force. You probably have not read the technical report that spurred the media frenzy (relative to the amount of press coverage normally allocated to space propulsion research, anyway), relying instead on media reports and their contracted expert opinions. The twist is that apparently no one else has read it either, excepting perhaps the participants of the conference at which it was presented; that includes me, and likely the authors of almost any other commentary you find on it. The reason is that the associated entry in the NASA Technical Reports Server consists only of an abstract.

The current upswing of interest and associated speculation on the matter of this strange drive is eerily reminiscent of other recent \begin{sarcasm}groundbreaking discoveries\end{sarcasm}: FTL neutrinos measured by the OPERA experiment and the Arsenic Life bacterium from Mono Lake, California. Both were later refuted, some important people at OPERA ended up resigning, and the Arsenic Life paper continues to boost the impact factors of the authors and publisher as Science Magazine refuses to retract it (current citations according to Google Scholar number more than 300).

I would venture that the OPERA findings were disclosed more responsibly than the Arsenic Life results. Although both research teams made use of press releases to gain a broad audience for their findings (note this down in your lab notebook as “do not do” if you are a researcher), the OPERA findings were at the pre-publication stage and disclosed as an invitation to greater scrutiny of the instrumentation, while the Arsenic Life strategy was much less reserved. From the OPERA press release:

The OPERA measurement is at odds with well-established laws of nature, though science frequently progresses by overthrowing the established paradigms. For this reason, many searches have been made for deviations from Einstein’s theory of relativity, so far not finding any such evidence. The strong constraints arising from these observations makes an interpretation of the OPERA measurement in terms of modification of Einstein’s theory unlikely, and give further strong reason to seek new independent measurements.

Notice the description of the search for exceptions to Einstein’s relativity as “. . . so far not finding any such evidence . . .”, despite the fact that the data they were reporting would be exactly such evidence, if instrumentation anomalies could be ruled out. This was a plea for help, not a claim of triumph.

By contrast, the press seminar accompanying the release of Felisa Wolfe-Simon et al.’s A bacterium that can grow by using arsenic instead of phosphorus issued no such caveats with its claims. It was readily apparent from the methods section of the paper that the Arsenic Life team made no strong effort to refute their own data (the principal aim of experimentation), and the review process at Science should probably have been more rigorous than standard practice. It is perhaps repeated too often without consideration, but I’ll mention the late, great Carl Sagan’s assertion that “extraordinary claims require extraordinary evidence.” The OPERA team kept this in mind, while the Arsenic Life paper swept under the carpet any due diligence in considering alternative explanations. Ultimately, the OPERA results were explained as an instrumentation error, and the Arsenic Life discovery has been refuted in several independent follow-up experiments (e.g. [1][2]).

Is propellant-less propulsion on par with Arsenic Life or FTL neutrinos in terms of communicating findings? In this case I would lean toward the latter: more a search for instrumentation error than a claim of the discovery of Totally New Physics. The title of the tech report, “Anomalous Thrust Production from an RF Test Device Measured on a Low-Thrust Torsion Pendulum,” carries the minimum requisite dose of skepticism.

Background reading below, but by far the best take on the subject is xkcd number 1404. The alt-text: “I don’t understand the things you do, and you may therefore represent an interaction with the quantum vacuum virtual plasma.”

23/08/2014 several typos corrected
[UPDATE: Full version of tech report: http://rghost.net/57230791] via comments from http://ow.ly/ADJqb .
http://www.wired.co.uk/news/archive/2014-07/31/nasa-validates-impossible-space-drive .
http://www.wired.co.uk/news/archive/2014-08/07/10-qs-about-nasa-impossible-drive .
http://www.wired.com/2014/08/why-nasas-physics-defying-space-engine-is-probably-bogus/?mbid=social_twitter .
http://en.wikipedia.org/wiki/Quantum_vacuum_plasma_thruster .
http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20140006052.pdf .
http://www.wired.co.uk/news/archive/2013-02/06/emdrive-and-cold-fusion .
http://www.aiaa.org/EventDetail.aspx?id=18582 .

Why is there no confidence in science journalism?

[Image: 1939 pulp science magazine cover design by Norman Saunders]

In the so-called Anthropocene, meaningful participation in humanity’s trajectory requires scientific literacy. This requirement holds at the population level: it is not enough for a small proportion of select individuals to develop this expertise and apply it only to the avenues of their own interest. Rather, a general understanding and use of the scientific method in forming actionable ideas for modern problems is a requisite for a public capable of steering policy along a survivable route. As an added benefit, scientific literacy produces a rarely avoided side-effect of knowing one or two things for certain, and of touching upon the numinous in the universe.

Statistical literacy is a necessary foundation for building scientific literacy. Confusion about the meaning of terms such as “statistical significance” (compounded by non-standard usage of the term “significance” on its own) abounds, so little of the import of these concepts survives when scientific results are described in mainstream publications. What’s worse, this leaves a jaded public knowing just enough to twist the jargon of science to support their own predetermined, potentially dangerous, conclusions (e.g. because scientific theories can be refuted by evidence to the contrary, a given theory, no matter the level of support from existing data, can be ignored when forming personal and policy decisions).

I posit that a fair amount of the responsibility for improving the state of non-specialist scientific literacy lies with science journalists at all scales. The most popular science-branded media do little to impart a sense of the scientific method, the context and contribution of published experiments, or the meaning of the statistics underlying the claims. I suggest that a standardisation of language for describing scientific results is warranted, so that results and concepts can be communicated in an intuitive manner without resorting to condescension, while still conveying the quantitative, comparable values used to form scientific conclusions.

A good place to start (though certainly not perfect) is the uncertainty guidance put out by the Intergovernmental Panel on Climate Change (IPCC). The IPCC reports benefit from translating statistical concepts of confidence and likelihood into intuitive terms without sacrificing the underlying quantitative meaning (mostly). In the IPCC AR5 report guidance on addressing uncertainty [pdf], likelihood statements of probability are standardised as follows:

  • Virtually certain: 99-100% probability
  • Very likely: 90-100%
  • Likely: 66-100%
  • About as likely as not: 33-66%
  • Unlikely: 0-33%
  • Very unlikely: 0-10%
  • Exceptionally unlikely: 0-1%

In the fourth assessment report (AR4), the guidance [pdf] roughly calibrated confidence statements to a chance of being correct. I’ve written the guidance here in terms of p-values, or the chance that results are due to coincidence (p = 0.10 = 10% chance), but statistical tests producing other measurements of confidence were also covered.

  • Very high confidence: at least a 9 in 10 chance of being correct (p < 0.10)
  • High confidence: about an 8 in 10 chance of being correct
  • Medium confidence: about a 5 in 10 chance of being correct
  • Low confidence: about a 2 in 10 chance of being correct
  • Very low confidence: less than a 1 in 10 chance of being correct
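
As a toy illustration of calibrated language (my own sketch, using the p-value framing above rather than anything published by the IPCC), note how easily a conventional p = 0.05 result clears the AR4 bar for “very high confidence”:

    def confidence_statement(p_value):
        # Map a p-value (chance the result is a coincidence) to AR4-style language.
        chance_correct = 1.0 - p_value
        if chance_correct >= 0.9:
            return "very high confidence"
        if chance_correct >= 0.8:
            return "high confidence"
        if chance_correct >= 0.5:
            return "medium confidence"
        if chance_correct >= 0.2:
            return "low confidence"
        return "very low confidence"

    print(confidence_statement(0.05))  # "very high confidence"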

Describing results via their confidence, rather than the usual statistical significance, is probably more intuitive to most people. Few general readers readily distinguish between statistical significance (the results are unlikely to be due to chance) and meaningful significance (the results matter in some way). Likewise, conventions for stating statistical significance are not well established in the scientific literature and vary widely by field. That said, the IPCC’s AR4 guidance threshold for very high confidence is quite low. Many scientific results are only considered reportable at a p-value of less than 0.05, a 5% chance that the pattern in the data arises by coincidence, whereas the AR4 guidance grants a statement of very high confidence to anything with less than a 10% chance of being wrong. Likewise, a 5-in-10 chance of being correct hardly merits a statement of medium confidence in my opinion. Despite these limitations, I think the guidance should merely have been updated to better reflect the statistical reality of confidence, and it was a mistake for the AR5 guidance to switch to purely qualitative standards for conveying confidence based on the table below, with highest confidence in the top right and lowest confidence in the bottom left.

[Table: AR5 qualitative confidence matrix — evidence (limited, medium, robust) increases from left to right, agreement (low, medium, high) from bottom to top; confidence increases toward high agreement and robust evidence in the top right.]

Adoption (and adaptation) of standards like these in regular usage by journalists could do a lot to improve the communication of science to a general readership. It would normalise field-variable technical jargon (e.g. sigma significance values in particle physics, p-values in biology) and reduce the need for daft analogies. Results described in this way would be amenable to meaningful comparison by generally interested but non-specialist audiences, while readers with a little practice in statistics would lose nothing to the dumbing-down.

Edited 2016/06/25 for a better title and to add the comic graphic. Cover design by Norman Saunders (Public Domain).
23 Aug. 2014: typo in first paragraph corrected:

. . . meaningful participation in participating in humanity’s trajectory. . .

References:

Michael D. Mastrandrea et al. Guidance Note for Lead Authors of the IPCC Fifth Assessment Report on Consistent Treatment of Uncertainties. IPCC Cross-Working Group Meeting on Consistent Treatment of Uncertainties. Jasper Ridge, CA, USA, 6-7 July 2010. <http://www.ipcc.ch/pdf/supporting-material/uncertainty-guidance-note.pdf>

IPCC. Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties. July 2005. <https://www.ipcc-wg1.unibe.ch/publications/supportingmaterial/uncertainty-guidance-note.pdf>

Much ado about sitting

A few years ago, athletic shoe companies began to cash in on a study or two suggesting that running in shoes was dangerous: guaranteed to ruin your joints and your life, make you less attractive and confident, etc. (at least, that’s how it was translated into press coverage). The only viable answer, vested marketing implied, was to buy a new pair of shoes with less shoe in them.

Despite the obvious irony, consumers flocked to purchase sweet new kicks and rectify their embarrassing running habits. Much like any other fitness craze, popular active-lifestyle magazines ran articles about the trend, spinning a small amount of scientific research into definitive conclusions, right next to advertisements for the shoes themselves. Fast forward to 2014: the makers of arguably the most notorious shoes in the minimalist sector, the Vibram FiveFingers line, have moved to settle a lawsuit alleging that the claimed health benefits of the shoes were not based on evidence. The market frenzy for minimalist footwear appears to have sharply abated. There are even blatant examples of market backlash in the introduction of what could be described as “marshmallow shoes,” such as the Hoka, with even more padding than runners were used to before the barefoot revolution.

An eerily similar phenomenon, market capitalisation on nascent scientific evidence, has appeared around the latest demon threatening our health: sitting. At the bottom of it is a fast-growing marketplace for accessories designed to get workers a bit less semi-recumbent in the workplace. This market was virtually non-existent only a few years ago, yet it is now substantial enough to have spawned an entire genre of internet article.

There is even a new term gaining traction for the condition: “sitting disease.” I sure hope it’s not catching. For now, at least, the term seems to remain quarantined in quotation marks in most places it is used.

Many of the underlying articles in science journals are what is euphemistically referred to as survey science. Long generation time, lack of uniform cultivation standards, and ethical considerations make Homo sapiens a rather poor model organism. Even if survey data were considered reliable (a dubious assumption), they only reveal associations. Even accelerometer studies, like those at the Mayo Clinic, only measure activity for a few weeks. The results can’t tell you that sitting alone causes obesity. An equally fair hypothesis would be that obesity increases the likelihood of staying seated, but that’s just called inertia.

Although the studies and their press coverage motivate a burgeoning marketplace for NEAT accessories, they don’t actually tell us much in the way of new information. A sedentary lifestyle is unhealthy. Attempts to increase the amount of low-intensity activity throughout the day, such as using a walking desk, are likely to stimulate appetite. Without considering diet (and downplaying the importance of exercise), a standing desk, sitting ball, or occasional walking meeting is not likely to have tremendous health benefits when taken alone. And despite the rhetoric, maintaining a smoking habit to break up your sit-time with walks to the outdoors is probably not an equivalent trade-off. Presenting health management in such an unbalanced, single-variable way seems motivated by trendiness for some, revenue for others, and both for the press. It is not that sitting is actually good for you; it’s just myopic to focus solely on that one health factor. As part of a sedentary lifestyle gestalt, yes, it does play a role in promoting ill-health. Then again, if you think about it, you probably already knew that before it was cool.


Avoid sensationalist science journalism; consider the sources:
Ford, E.S., and Caspersen, C.J. (2012). Sedentary behaviour and cardiovascular disease: a review of prospective studies. Int J Epidemiol 41, 1338–1353.
Hamilton, M.T., Hamilton, D.G., and Zderic, T.W. (2007). Role of low energy expenditure and sitting in obesity, metabolic syndrome, type 2 diabetes, and cardiovascular disease. Diabetes 56, 2655–2667.
Katzmarzyk, P.T., Church, T.S., Craig, C.L., and Bouchard, C. (2009). Sitting time and mortality from all causes, cardiovascular disease, and cancer. Med Sci Sports Exerc 41, 998–1005.
Rosenkranz, R.R., Duncan, M.J., Rosenkranz, S.K., and Kolt, G.S. (2013). Active lifestyles related to excellent self-rated health and quality of life: cross sectional findings from 194,545 participants in The 45 and Up Study. BMC Public Health 13, 1071.
Rovniak, L.S., Denlinger, L., Duveneck, E., Sciamanna, C.N., Kong, L., Freivalds, A., and Ray, C.A. (2014). Feasibility of using a compact elliptical device to increase energy expenditure during sedentary activities. Journal of Science and Medicine in Sport 17, 376–380.
Schmid, D., and Leitzmann, M.F. (2014). Television Viewing and Time Spent Sedentary in Relation to Cancer Risk: A Meta-analysis. JNCI J Natl Cancer Inst 106, dju098.
Young, D.R., Reynolds, K., Sidell, M., Brar, S., Ghai, N.R., Sternfeld, B., Jacobsen, S.J., Slezak, J.M., Caan, B., and Quinn, V.P. (2014). Effects of Physical Activity and Sedentary Time on the Risk of Heart Failure. Circ Heart Fail 7, 21–27.

“Where is everybody?”


Don’t get too excited about finding E.T. just yet. Get excited about the engineering.

A few days ago NASA held a press conference moderated by NASA Chief Scientist Ellen Stofan. The filtered headline that eventually made its way into the popular consciousness of the internet is that the discovery of extraterrestrial life is a paltry couple of decades away. The ways the conference was parsed into news form ranged from the relatively guarded “NASA scientists say they’re closer than ever to finding life beyond Earth” at the LA Times to the more sensational “NASA: ALIENS and NEW EARTHS will be ours inside 20 years” at The Register. As statements, the former headline is almost unavoidably true given the assumption that humans eventually stumble upon life off-planet, and the latter is only one more over-capitalised word from being wholly fantastic. Neither actually touches on the content of the NASA press conference.

The conference was partly prompted by the April announcement of the Kepler program’s discovery of the Earth-similar Kepler-186f, which happens to reside in the habitable zone of its eponymous parent star. Although Kepler-186f definitely might be sort of a bit more Earth-like, its discovery was only the latest in a long list of over 1800 exoplanets posited to exist to date. Although the main technique for exoplanet discovery, detecting the dimming of a star as a planet transits in front of it, is not infallible [paywalled primary source], the continued refinement of modern signal processing for unearthing (heh) exoplanet signatures makes this an exciting time to look skyward.
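
For a sense of the signal involved (a back-of-envelope sketch of my own, not from the post), the fractional dimming during a transit is roughly the square of the planet-to-star radius ratio:

    # Transit depth ~ (R_planet / R_star)^2
    r_earth_km = 6371
    r_sun_km = 696000

    depth = (r_earth_km / r_sun_km) ** 2
    print(f"Earth transiting the Sun dims it by {depth:.1e}")  # ~8.4e-05, about 84 ppm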

The speakers took a broad view of the progression toward answering the question “are we alone?” John Grunsfeld, Hubble mechanic extraordinaire, emphasised the approach of looking for spectral signals corresponding to bio-signatures with the upcoming James Webb telescope. Of course, the terracentric focus shared by the panel means that NASA plans to look for signals associated with Earth life: water, methane, oxygen, etc. Carl Sagan et al. once considered the task of finding similar biosignatures on Earth itself. Looking for signs we know to be associated with our own experience of life is our best current guess for what we should be looking for, but no guarantee exists that it is the right one. We are no longer too enthralled by the idea of trading arsenate for phosphate, but our own planet holds enough examples of strange metabolism that we should expect life off-planet to span even more peculiar possibilities. Imagine our chagrin if we spend a few centuries looking for spectral signatures of water before stumbling across hydrophobic biochemistry on Titan.

Many of us may remember the nanobe-laden Martian meteorite ALH84001 that touched off a burst of interest and a flurry of Mars probes in the latter half of the 1990s. As with the 100-200 nm fossilised “bacteria” in that meteorite, the tone of imminent discovery of extraterrestrial life (particularly in the sensationalist coverage by the lay press) serves as nothing more than hyperbolic rhetoric. If this effect carries over to those with a hand on the purse-strings, so much the better, but don’t get too caught up as a member of the scientifically literate and generally curious public. The likelihood of finding life outside our own planet in a given time span is essentially impossible to predict with no priors, hence the famous Fermi paradox that gives this post its title. The actual content of the video is much more important than the wanton speculation that fuels its press coverage.

A major advantage of placing the Hubble space telescope above the atmosphere was to avoid optical aberrations generated by atmospheric turbulence. The present state of the art in adaptive optics and signal processing essentially obviates this need, as ground-based telescopes such as the Magellan II in Chile can now outperform the Hubble in terms of resolution. The James Webb will offer some fundamentally novel capabilities in what it can see, with a 6.5-metre primary mirror and sensors covering wavelengths from 600-nanometre red light to the mid-infrared at 28 microns.
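
That mirror size translates directly into resolving power; a quick estimate of my own (not from the press conference), using the Rayleigh criterion:

    import math

    # Rayleigh criterion: smallest resolvable angle ~ 1.22 * wavelength / mirror diameter
    wavelength_m = 2e-6   # 2-micron near-infrared light
    mirror_d_m = 6.5      # James Webb primary mirror

    theta_rad = 1.22 * wavelength_m / mirror_d_m
    theta_arcsec = math.degrees(theta_rad) * 3600
    print(f"~{theta_arcsec:.3f} arcseconds")  # ~0.077 arcsec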

The upcoming TESS survey, described by MacArthur Fellow Sara Seager, will use the same basic technique as the Kepler mission, observing planetary transits, to look for exoplanets. TESS will launch in 2017, slightly in advance of the main attraction of the JWST. Looking for planetary transits has served us well in the past, but direct imaging is the holy grail. To further that goal, Seager described a starshade for occluding bright, planet-hosting stars as part of the New Worlds mission. The design resembles a sunflower rather than a circular shade; a circular edge would introduce Airy rings from diffraction. Desert tests of the prototypes have been encouraging so far. The precision engineering of the shade’s unfolding is another masterpiece. Due to its size, deployment cannot be tested in a terrestrial vacuum chamber, requiring the engineering to be all the more precise. I could see scale versions of the design doing quite well as parasols in the gift shop.

[Image: artist’s concept of the New Worlds Observatory]

Image from NASA via Wikipedia

The natural philosophy that we now call science has roots in the same fundamental questions as “regular” philosophy. “Are we alone?” is really just a proxy for “Where are we, how does it work, and why are we here?” Without any definitive answers to these questions on the horizon, I think we can safely say that building the machines that allow us to explore them, and conditioning our minds to think about our universe, is a pretty good way to spend our time. It will be a lonely universe if we find ourselves to be a truly unique example of biogenesis, but not so lonely in the looking.

As for yours truly, I’m looking forward to the “Two Months of Terror” (to quote Grunsfeld), October-December 2018, as the James Webb telescope makes its way to the L2 Lagrange point to unfold and cool in preparation for a working life of precipitous discovery.

Link to video

Panel:
Ellen Stofan - Chief Scientist, NASA
John Grunsfeld - Astrophysicist, former astronaut, Hubble mechanic
Matt Mountain - Director, Space Telescope Science Institute
John Mather - Project scientist, James Webb telescope; 2006 Nobel laureate in physics
Sara Seager - Astrophysicist, MIT Principal Investigator, MacArthur Fellow 2013
Dave Gallagher - Electrical Engineer, Director of Astronomy and Physics at the Jet Propulsion Laboratory

Also read up on ESA projects: the Herschel Space Observatory, observing at 60 to 500 microns, and Gaia, a satellite set to use parallax to generate a precise galactic census.

Top image by the author

Good Seeing

[Image: Van Gogh’s The Starry Night, plus adaptive optics]

Long the purview of telescopes, the dynamic mirrors and wavefront engineering that enable astronomers to calm the night sky’s twinkle are now finding applications in biological microscopy as well. The techniques, termed adaptive optics, are leading to major improvements in the clarity and depth-imaging capabilities of today’s microscopes.

The eyes may or may not be the windows to the soul, but our ocular world plays a central role in how our minds are built. Of all human senses, sight is the most influential to our worldview, the most relied upon to ascertain the validity of our guesses about reality. Galileo’s observations showed us our place in the cosmos by providing the evidence to test the conflicting ideas of Ptolemy and Aristotle against Copernicus and Kepler. Hooke, Leeuwenhoek, and Spinoza seeded the scientific landscape with observations that would provide an alternative to the commonly held belief that diseases were miasmatic, that is, a result of “bad air” (the literal meaning of “malaria”), and preventable by applied fragrance. Even the word “cell,” for the fundamental building block of life, was coined as Robert Hooke viewed the organization of a slice of cork through the lens of a microscope.

Humans are lucky to live under a dense atmosphere, keeping us warm and respiring while it protects us from (most) meteors drawn to our gravity well. The downside is that Earthbound astronomers are like a swimmer watching a birthday party from the bottom of a pool, an assuredly poor choice of viewpoints. The dense, turbulent atmosphere of the Earth confounded observation of especially dim extraterrestrial objects until a few decades ago, when deformable mirrors were introduced to astronomical telescopes to counteract atmospheric aberration. Before adaptive optics, astronomers had few tools to counteract the atmosphere, making do by building observatories at high altitudes and waiting for “good seeing” conditions to attenuate the blurring of active air. In contrast, a modern adaptive optics telescope can best the resolution of the famous Hubble space telescope, as in the case of the adaptive-optics-enabled Magellan II in Chile.
Astronomers have to look out through the thick soup of the atmosphere, but biological microscopists looking into tissues have even more challenges to contend with. Observing living cells in vivo is like trying to peer through a nice cup of milky chai, and it is a major hurdle to determining the nature of life intact and in action. This has led to the adaptation of the same techniques previously developed to take out the night’s twinkle, now used to undo the blur of microscopic imaging at depth.

The problem of imaging through a highly aberrating medium is experienced twice when imaging into tissues: once on the illumination side of the path and again as the signal leaves the sample. Getting enough light to the desired depth and location, and then successfully back to a point detector or image sensor, determines the clarity and speed with which an image can be formed.

In microscope design there are three characteristics to optimise: temporal resolution (speed), spatial resolution, and depth (signal to noise). Improving one aspect of an instrument invariably leads to a decrease in another. Antonie van Leeuwenhoek made incredible observations using instruments made by carefully melting pulled strands of glass in a flame and setting the resulting aspherical lenses in pinhole brass frames. The tiny, single-lens instruments bear more resemblance to a magnifying glass than to a modern compound microscope. Using one was a matter of holding the entire instrument a few centimetres from the eye and squinting through a tiny aperture at the subject, generally illuminated by sunlight. Game-changing innovations that push the overall capability of microscopy beyond zero-sum trade-offs in design optimisation are few and far between. It is becoming increasingly clear as the technology matures that adaptive optics is fundamentally enabling for imaging tasks that were not possible with the instruments of a decade ago.

One realm in which adaptive optics is finding ready application is optical imaging of the living brains of model organisms. Brain imaging was the same inspiration that led Marvin Minsky to invent the confocal microscope in the late 1950s. In confocal microscopy, a pinhole plate rejects the majority of light arising from out-of-focus areas as the microscope beam is scanned throughout the sample of interest. The pinhole plate ensures that out-of-focus light is rejected, but there is a fundamental limit to how much total optical power can be pumped into tissue before damage occurs. Thanks to the pinhole plate, improving confocal imaging with adaptive optics is straightforward: any increase in the signal making it to the detector indicates a positive correction for aberrations. Optimising the dynamic elements of the microscope is therefore a matter of producing the maximum signal.
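
That signal-maximisation loop can be sketched in a few lines of Python (my own illustration; measure_signal and the mirror-mode interface are hypothetical stand-ins for real hardware):

    import numpy as np

    def optimise_mirror(measure_signal, n_modes=12, step=0.1, n_iters=200, seed=0):
        # Sensorless adaptive optics by hill climbing: perturb one mirror-mode
        # coefficient at a time and keep any change that brightens the detector.
        rng = np.random.default_rng(seed)
        coeffs = np.zeros(n_modes)
        best = measure_signal(coeffs)
        for _ in range(n_iters):
            trial = coeffs.copy()
            mode = rng.integers(n_modes)
            trial[mode] += rng.choice([-step, step])
            signal = measure_signal(trial)
            if signal > best:
                coeffs, best = trial, signal
        return coeffs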


Beam shaping and point-spread function engineering enable new imaging modalities in microscopy.

The application of adaptive optics to neuroimaging instills a strong sense of the cutting edge, combining brain science with optical physics, but some areas where adaptive optics has made a striking impact are much more domestic. Researchers at Durham University, UK, employed adaptive optics confocal microscopy to measure the effects of temperature on the activity of cold-water lipases, enzymes that break down fat and grease. Enzyme names are one of the rare cases of scientific nomenclature being intuitive and informative: the name “lipase” identifies the enzyme’s substrate as lipids, or fats, and the suffix “-ase” denotes its activity of cutting them apart. Much like a greasy fingerprint on a pair of sunglasses, the very presence of the substrate induces unwanted blurring amenable to correction with dynamic optical elements.

The state of the art is no longer limited to improving the precision and design of static optics. Dynamic elements allow the microscope and the microscopist to adapt to specific imaging situations, a task for which algorithms and image processing are essential. The computational brains of modern microscopes are integral components of the optical system, as essential as the lenses and mirrors that make up the physical hardware.

In pushing the limits of the types of scientific questions that can be addressed with light, there’s no requirement to generate a two-dimensional image. Rather, the data required to test a given hypothesis may exist as a three-dimensional construct, a four-dimensional volume plus time, or something of even higher dimensionality. Although visualisation of data will continue to be important for science communication, the central role of the image in science may soon take a back seat to generalised, multidimensional data. Paralleling this shift, the next generation of light microscopes will look radically different from our conventional expectations. The shift has already appeared in commercially available microscopes: the computer is so integral to the light sheet microscope made by German optics giant Zeiss that the instrument does not have eyepieces. The microscopes we use tomorrow will resemble modern microscopes to the same extent that modern microscopes remind us of Leeuwenhoek’s hand lenses.


In short order that shiny new confocal system may share the fate of this Leeuwenhoek replica, part of the collection at the Oxford Museum of the History of Science.

Adaptive optics links:

http://www.nature.com/lsa/journal/v3/n4/fig_tab/lsa201446f3.html#figure-title

http://www.nature.com/lsa/journal/v3/n4/full/lsa201446a.html

http://www.iop.org/careers/working-life/articles/page_59193.html

Edit 2014/06/24: Fixed links

The Color Coded Tiers of Open Access

Open Access, or OA, in scientific publishing is bringing long-due attention to the question of availability. University libraries pay millions of dollars per year for subscriptions, sometimes under the influence of coercive package deals that encourage libraries to subscribe to a bundle of journals rather than pick and choose the most relevant. Tim Gowers, a fellow at Trinity College, Cambridge, reports that UK university libraries pay anywhere from £234,126 (Exeter) to £1,381,380 (University College London) in subscription costs to Elsevier alone. The excessive cost increases in journal subscriptions have led to substantial actions by some universities, including a cancellation of Elsevier subscriptions by Harvard, MIT avoiding a 3-year renewal commitment with Wiley and Elsevier, and selective cancellation of Elsevier journals by Cornell, to name a few.

The debate over the efficacy of the scientific publishing status quo is alive and well. By most counts the rate of retractions has increased, although it is not clear whether more retractions are caused by more misconduct or better vigilance. eLife editor and Nobel laureate Randy Schekman, among others, suspects that the pressure to publish in superstar journals and over-reliance on impact factor lead to misplaced incentives and reward showy failures. For example, the infamous “arsenic life” paper has amassed 287 citations as of this writing, as indexed by Google Scholar, and is unlikely to be retracted by Science as a result; since impact factor is roughly the average citations per article, those 287 references could buoy an additional 8 articles with little to no citations and still maintain Science’s impact factor of ~32.
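
To make that arithmetic explicit (a back-of-envelope sketch of my own):

    # Impact factor ~ mean citations per article over the counting window.
    arsenic_citations = 287
    impact_factor = 32

    # How many completely uncited articles can one such paper "carry"?
    carried = arsenic_citations / impact_factor - 1
    print(f"~{carried:.0f} additional uncited papers")  # ~8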

So maybe you’ve become a bit frustrated with paywalls and the relative attention (and citation) monopoly enjoyed by top-brand journals. Perhaps you are tired of your library paying exorbitant fees for bundled subscriptions. In any case, you’re considering pursuing open access for some of your work. It may be as simple as hosting PDFs of your articles on your own, but the options are diverse, as are the costs. OA is typically differentiated into two major types, designated by color: gold and green.

Green OA refers to self-hosting of copies by a person, lab, or university. These can be archived and made available as pre-prints, post-prints, or in the final, formatted version published by the journal. The latter method can be contentious with some publishers (see the recent spat between Nature and Duke University over the open access mandate there). SHERPA/RoMEO further differentiates the green OA friendliness of journals along a range of colors reflecting what is allowed by a journal’s or publisher’s copyright transfer agreement:

  • green: pre-print, post-print, and publisher’s version
  • blue: post-print and publisher’s version
  • yellow: pre-print and publisher’s version
  • white: not designated/not allowed

Gold OA is driven by the journal or publisher, rather than the author or university. These are the journals typically thought of as open access, and they usually, but not always, charge a hefty fee to authors. Journals under the PLOS umbrella belong to this category, and big-name publishers have been dipping their toes into gold open access as well.

A hybrid approach to publishing is becoming widespread. This is often implemented as optional OA available for a few thousand dollars charged to the author, such as the policy employed by the Journal of Visualized Experiments or Optical Society publications. Other journals make the headline article for an issue freely available, often in advance of print publication, to draw interest. Many journals have explicit policies that OK green OA after a designated grace period; Science, for example, allows free access to articles 12 months after initial publication.

OA has a role to play in the changing landscape of scientific publishing, but there are still plenty of variations to be tried, and OA is no silver bullet for all that ails publication, funding, and promotion in science careers. Web resources such as figshare expand the role of data and figures, while online lab notebooks like OpenWetWare increase transparency. F1000 Research is experimenting with citeable, viewable, open peer review. OA won’t stop the occasional “arsenic life” paper from stealing headlines, but it will certainly shape the future of access.

Additional OA resources:

The University of California Berkeley Library maintains an index of publishers with gold open access options and their associated publishing fees.

Duke University OA mandates versus Nature Publishing Group:
Duke Libraries take by Kevin Smith, JD: https://blogs.library.duke.edu/scholcomm/2014/03/27/attacking-academic-values/
Nature Publishing Group’s take by Grace Baynes http://blogs.nature.com/ofschemesandmemes/2014/03/28/clarifying-npgs-views-on-moral-rights-and-institutional-open-access-mandates

SHERPA/RoMEO. Provides shades of green to denote publisher’s OA archiving policies: http://www.sherpa.ac.uk/romeoinfo.html

Directory of Open Access Journals: http://doaj.org/

University of Colorado Denver Librarian Jeffrey Beall’s site: http://scholarlyoa.com/
Beall’s blog includes his list of potentially predatory publishers (http://scholarlyoa.com/publishers/), potentially predatory journals (http://scholarlyoa.com/individual-journals/), and the newer list of exploitative metric indexes (http://scholarlyoa.com/other-pages/misleading-metrics/). These are essential resources, particularly useful when conventional publishers conflate known exploitative publishers with OA as a whole.

A Phylogeny of Internet Journalism

While reading press coverage of the UW-Madison primate caloric restriction study for my essay, I kept getting deja vu as I came across the same language over and over. Much of this was due to the heavy reliance of early coverage on the press release from the University of Wisconsin-Madison, and to sites buying stories from each other, so I decided it might be informative to make a phylogenetic tree of the coverage. To do so, I took the text from the first two pages of Google News results for “wisconsin monkey caloric restriction” and built a phylogenetic tree based on a multiple sequence alignment, after converting the English text to DNA sequences. I found a total of 27 articles on the CR study and included one unrelated outgroup, for a total of 28.

I used DNA Writer by Lensyl Urbano (CC BY-NC-SA) to convert the text of each article into a DNA sequence. The algorithm associates each character with a three-nucleotide sequence, just as our own genome defines amino acids with a three-letter code. Unlike our genetic code, Urbano’s tool is not degenerate: each character has exactly one corresponding three-letter code. With four bases (adenine, thymine, guanine, and cytosine) there is room for 4^3 = 64 unique codes. For example, “I want to ride my bicycle” becomes

CTGAGCATGACTCTCTAGAGCTAGTGTAGCCACCTGTACCTAAGCACAGACAGCCATCTGTCAGACTCAATCCTA

The translation table and tool are available at http://earthsciweb.org/js/bio/dna-writer/.
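
The same idea can be sketched in a few lines of Python (my own illustration; the codon assignments below are arbitrary, not Urbano’s actual table):

    from itertools import product

    # An arbitrary, non-degenerate character -> codon table: 4^3 = 64 codons
    # comfortably cover lowercase letters, digits, and basic punctuation.
    ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789 .,;:'!?-()"
    CODONS = ["".join(c) for c in product("ATGC", repeat=3)]
    TO_DNA = dict(zip(ALPHABET, CODONS))

    def encode(text):
        # Translate text to DNA, one codon per character; unknown characters are dropped.
        return "".join(TO_DNA[ch] for ch in text.lower() if ch in TO_DNA)

    print(encode("I want to ride my bicycle"))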

To build the trees and alignments I used MAFFT. The sequences derived from each article can be relatively long, and MAFFT can handle longer sequences due to its use of the Fast Fourier Transform. MAFFT is available for download or use through a web interface here. I used the web interface, checking the Accurate and Minimum Linkage run options.

Once I had copied the tree in Newick format, I ran FigTree by Andrew Rambaut to generate a useful graphical tree. I had included an unrelated article from Scientific American as an outgroup, and I chose the branch between that article and the group composed of press coverage of the UW macaque caloric restriction study as the root. This would correspond to the last common ancestor on a real phylogenetic tree.

The resulting tree contains some interesting clades: for example, ScienceDaily, eScienceNews, and News-Medical, which all essentially reproduced the UW-Madison press release, are grouped together. Another obvious group is the Tampa Bay Times and the Herald-Tribune, which sourced the article from the New York Times and pared it down for their readers.

[Figure: phylogenetic tree of press coverage of the UW macaque caloric restriction study]

Here is the tree in Newick format:

(((1_theScinder-:0.845,(((((((((((((((2_UWMPressRelease:0.0085,((4_escienceNews_UWM_:5.0E-4,5_ScienceDaily_UWPressRelease:5.0E-4):0.0,15_news-medical_UWM:5.0E-4):0.008):0.3115,26_aniNews:0.32):0.392,(14_natureWorldNews:0.7055,16_techTimes:0.7055):0.0065):0.006,25_expressUK:0.718):0.0025,20_hngn:0.7205):0.0195,(8_MedicalNewsToday:0.0,18_bayouBuzz_medicalNewsToday:0.0):0.74):0.0025,27_newsTonightAfrica:0.7425):0.047,(17_perezHilton:0.7805,(19_theVerge:0.6905,24_cbsLocalAtlanta:0.6905):0.09):0.009):0.0075,7_IFLS:0.797):0.007,21_seattlepi:0.804):0.006,12_nature:0.81):0.021,(6_yahooNews:0.0285,10_livescience:0.0285):0.8025):5.0E-4,((3_NYTimes:0.1875,11_HeraldTribune_NYT:0.1875):0.344,13_tampaBayTimes_NYT:0.5315):0.3):0.008,22_iol_dailyMail:0.8395):5.0E-4,9_healthDay/Philly_com:0.84):0.005):0.004,23_bbc:0.849):0.0245,28_OUTGROUPSciAmYeastyBeasties:0.8735);

. . .and this is a list of all the addresses for the articles I used and their labels on the tree: https://thescinder.com/pages/key-to-uwm-mac…logenetic-tree/
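
If you want to explore the tree yourself, it loads readily with Biopython (a sketch of my own; it assumes the Newick string above has been saved to press_tree.nwk):

    from Bio import Phylo  # Biopython

    # Assumes the Newick string above was saved to press_tree.nwk.
    tree = Phylo.read("press_tree.nwk", "newick")
    Phylo.draw_ascii(tree)  # quick text rendering of the press-coverage clades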

Come on you monkeys, do you want to live forever?


Members of the control group for the Wisconsin National Primate Research Center caloric restriction study were fed an ad libitum diet of processed food.

The infinite monkey theorem, perhaps first invoked by French mathematician Émile Borel, posits that a monkey condemned to randomly punch keys on a typewriter for an infinite period of time would eventually produce the complete works of Shakespeare. The thought experiment may also be a good metaphor for encapsulating the experience of writing amateur science journalism.

Now consider the same experiment, replacing the generic monkey with members of the species Macaca mulatta, rhesus macaques, and the typewriter with as much processed food as the macaques can stuff into their furry little faces. Modestly pare down the timescale of the experiment from infinite time to about 25 years, increase the number of macaques from one lonely typist to about 38 individuals, and you have a pretty good first approximation of the control group for the University of Wisconsin-Madison Energy Metabolism and Chronic Disease study. You’ll be more familiar with the name used in the popular press, something including the words “caloric restriction,” “longevity” or “lifespan,” and “monkey.”

Caloric restriction (CR) has a long history of increasing longevity in yeast, nematodes, and mice. Youtube is full of mini-documentaries detailing the lives of the voluntarily emaciated, and many a blogger describes their day-to-day struggle to minimize caloric intake. The human caloric restriction community may have breathed a combined sigh of frustration and relief in 2012 when de facto rivals at the National Institute on Aging (NIA), led by Dr. Rafael de Cabo, published an article contradicting the 2009 claim that it works in monkeys, too.

The most recent foray in the field of macaque CR, published in Nature Communications by Dr. Ricki Colman et al. of Wisconsin, claims that the NIA study’s control monkeys were actually on a CR diet as well, albeit one less extreme than the 30% reduction of the experimental diet. They compared the mean weight of control monkeys in both studies to a national database of research macaque mass, the internet Primate Ageing Database, or iPAD. The NIA controls were indeed as much as 15% lighter than the averages in the database, as would be expected if the animals were on a restricted diet. However, the UW controls were 5-10% heavier than average, blurring the line between normal feeding and overeating. iPAD does not distinguish between solitary and group housing in macaques, while both the NIA and the Wisconsin study housed each individual separately.

The difference ultimately comes down to a discrepancy in what is considered a normal diet. Colman et al. fed controls as much of a fortified, low-fat, relatively sugar-rich diet as they wanted. This ad libitum feeding was meant to mirror the eating habits of humans. At the NIA, controls were given a diet based on estimated nutritional need, rather than appetite, and the food was less processed.

Since the goal of using primates in this research is to translate the results to humans, the differing diet choices for controls represent a meaningful philosophical difference: should we compare experiments to how we are, or to how we should be? Granted, the industrialized world is now more overweight than not, and the control group studied by UW researchers may be a more realistic mirror of the human condition. But the survival benefits seen in the CR group may boil down to the benefits of eating a reasonable diet, avoiding excessive sugar, and getting out of the cage once in a while. In short, the UW study was designed in a way that erred on the side of confirming its hypothesis, while the NIA study left much more room for the null alternative.

The controversy underlines the difficulty of taking promising results in “lower” animals and common model organisms and applying them to humans. The idea of putting 76 humans into controlled conditions for 25 years to test a radical diet, or any other intervention, belongs to the horror subtype of science fiction. This is why many of the health reports that trickle down into the popular press are based on “survey science,” in which respondents answer questionnaires regarding their diet and lifestyle, with varying degrees of quantitative oversight. This is in large part what leads to the impression that every other week the things that kill you are healthy again, and vice-versa. It pays, in terms of publicity, for a university press office to encourage journalists to parrot a warning that eating meat is as deadly as smoking, even though human self-reporting is notoriously bad and the underlying data may be a bit more subtle.

The climate for ethical considerations, even in non-human primate research, is evolving. In early 2013, the National Institutes of Health announced that they would begin retiring active chimpanzees from research with no intent to replace them. It is unlikely that the experimental conditions of either the NIA or the Wisconsin study will be reproduced in the near future, so there won’t be any mulligans for CR in monkeys. This raises the scrutiny and standard of evidence for the results of these experiments, and makes it all the more important for the scientific community and popular press to come to cohesive conclusions.

The “need for consensus” may be overstated, as the two studies are very different experiments. Those who are scientifically literate and have the time and inclination to read the literature are unlikely to be misled in their conclusions, but this group will not include most people who may be affected by the outcome. After all, everyone gets old eventually, if they are lucky enough. The responsibility to avoid painting the situation as a sensational controversy, and to accurately convey the results of these experiments, belongs to science journalists and academics in combination.


Relevant articles (appended 2016/01/06):
Ricki J. Colman, T. Mark Beasley, Joseph W. Kemnitz, Sterling C. Johnson, Richard Weindruch & Rozalyn M. Anderson. Caloric restriction reduces age-related and all-cause mortality in rhesus monkeys. Nature Communications 5, Article number: 3557. doi:10.1038/ncomms4557. Received 12 October 2013; accepted 05 March 2014; published 01 April 2014.

Evi M. Mercken, Bethany A. Carboneau, Susan M. Krzysik-Walker, and Rafael de Cabo. Of Mice and Men: The Benefits of Caloric Restriction, Exercise, and Mimetics. Ageing Res Rev. 2012 Jul; 11(3): 390-398. Published online 2011 Dec 20. doi:10.1016/j.arr.2011.11.005

Is the future of scientific publishing in-house open access?

Photo from flickr user Tom Marxchivist, 1952 cover by Basil Wolverton, used under CC attribution license.

Those of you who frequent theScinder know that I am pretty passionate about how science is disseminated, and you have probably noticed that, like their brethren in newsprint and magazines before them, the big-name publishers don’t know exactly how to react to a changing future; despite what traditional publishers would have you believe, they are not immune to publishing tripe.

Nature may be butting heads with Duke University over requesting waivers for the open access policy in place there. Apparently the waiver request isn’t even necessarily based on the practical implementation of Duke’s open access policy (Nature allows articles to be made freely available in their final version 6 months after publication), but it does raise the question: how much hassle will universities and their faculty put up with before they take matters into their own hands? As MIT’s Samuel Gershman points out, modern publishing doesn’t cost all that much. Even the fairly exorbitant fees charged to authors by the “gold standard” open access publishers may be a transient relic of the conventional (turning archaic?) publishing business model. This provides an incentive for predatory publishing (as discussed in this article at The Scientist, and the basis for the Bohannon sting published in Science last October). But if peer review and editing are largely volunteer labour, performed as an essential component of the role of a researcher and with the bill largely footed as a public expenditure, why keep paying enormous subscription fees for traditional publishing? If the trend catches on, as it almost certainly will, leading institutions will continue to adopt open access policies and libraries will see less and less reason to keep paying for outdated subscriptions.

Relevant links:

Scholarly Publishing: Where is Plan B?

California university system considers boycotting Nature Publishing Group

Samuel Gershman’s ideal publishing model, the Journal of Machine Learning Research