Seeing at Billionths and Billionths

This was my (unsuccessful) entry into last year’s Wellcome Trust Science Writing Prize. It is very similar to the post I published on the 24th of June, taking a slightly different angle on the same theme.

The Greek word skopein underlies the etymology of a set of instruments that laid the foundations for our modern understanding of nature: microscopes. References to the word are recognizable across language barriers thanks to the pervasive influence of the ancient languages of scholarship, and common usage gives us hints as to the meaning. We scope out a new situation, implying that we not only give a cursory glance but also take some measure or judgement.

Drops of glass held in brass enabled Robert Hooke and Antonie van Leeuwenhoek to make observations that gave rise to germ theory. Light microscopy unveiled our friends and foes in bacteria, replacing humours and miasmas as primary effectors driving human health and disease. The concept of miasmatic disease, a term that supposes disease is caused by tainted air, is now so far-fetched the term has been almost entirely lost to time. The bird-like masks worn by plague doctors were stuffed with potpourri: the thinking of the time was that fragrance alone could protect against the miasma of the Black Death. The idea seems silly to us now, thanks to the fruits of our inquiry. The cells described by Hooke and the “animalcules” seen by Leeuwenhoek marked a transition from a world operated by invisible forces to one in which the mechanisms of nature were vulnerable to human scrutiny. In short, science was born in the backs of our eyes.

The ability of an observer using an optical instrument to differentiate between two objects has, until recently, been limited by the tendency of waves to bend at boundaries, a phenomenon known as diffraction. The same effect can be seen in water ripples bending around a pier. The limiting effects of diffraction were formalised by the German physicist Ernst Abbe in 1873.

If the precision of optical components is tight enough to eliminate aberrations, and seeing conditions are good enough, imaging is said to be “diffraction-limited.” With the advent of adaptive optics, deformable mirrors and the like let observers correct aberrations introduced by the sample as well as by the optics. Originally developed to spy on dim satellites through a turbulent atmosphere, adaptive optics has recently been applied to microscopy to counteract the blurring effect of looking through tissue. If astronomy is like looking out from underwater, microscopy is like peering at the leaves at the bottom of a hot cuppa, complete with milk.

Even with the precise control afforded by adaptive optics, the best possible resolution is still governed by the diffraction limit, about half the wavelength of light. Before Leeuwenhoek’s time, the microbial world was invisible. For most of the 20th century, the molecular machinery underpinning the cellular processes of life was likewise invisible, smeared by the diffraction limit into an irresolvable blur.

A human cell is typically on the order of ten microns in diameter. Its proteins, membranes, and DNA are organised at a scale about one-thousandth as large, in the tens and hundreds of nanometres. In a conventional microscope, information at this scale is not retrievable thanks to diffraction, but it underlies all of life. Many of the mechanisms of disease operate at this level as well, and knowledge of how and why cells make mistakes has resounding implications for cancer and aging. In the past few decades, physicists and microscopists have developed a number of techniques to go beyond the diffraction limit and measure the nanometric technology that makes life.

These techniques vary widely, relying on some combination of engineered illumination and engineered fluorescent proteins. What they share is computation: the computer has become as important an optical component as a proper lens.

New instrumentation enables new measurements in the service of human inquiry. Questions about biology at ever smaller spatial scales, in ever more challenging imaging contexts, generate the need for higher-precision techniques, which in turn open a floodgate of previously unobtainable data. New data lead to new questions, and the cycle continues until it abuts the fundamental laws of physical nature. Before bacteria were discovered, it was impossible to imagine their role in illness, and equally impossible to test it. Once that role was known, it became a simple intuitive leap for Alexander Fleming to suppose that the growth inhibition of bacteria by fungi he saw in the lab might be useful as medicine. With the ability to see at the level of tens of nanometres, another world of invisible forces has been opened to human consideration and innovation. Scientists have already leaped one barrier at the diffraction limit. With no fundamental limit to human curiosity, let us consider current super-resolution techniques the first of many triumphs of detection past the limits of what is deemed possible.

Does STED really break the law?


Ernst Abbe, legendary German physicist of yore, defined the resolution of a microscope in terms of the wavelength of light and the numerical aperture of the objective, which depends on the ratio of the objective aperture radius to its working distance. This means that the shorter the working distance and the wider the objective lens, the greater the resolution, all else being equal. The result is commonly referred to as “Abbe’s diffraction limit”, a fundamental, physical limit to our ability to form images with lenses. Wavelength plays into it too, and that’s why the longer wavelengths used in, say, two-photon microscopy can only resolve light sources separated by a greater distance.
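Abbe’s limit is easy to evaluate numerically. The sketch below uses the standard form d = λ / (2·NA); the particular wavelengths and the 1.4-NA objective are illustrative values I have chosen, not figures from the text.

```python
def abbe_limit(wavelength_nm, numerical_aperture):
    """Abbe's diffraction limit: d = lambda / (2 * NA), in nanometres."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Green emission through a high-NA oil-immersion objective (illustrative values):
d_green = abbe_limit(520, 1.4)   # roughly 186 nm
# The longer near-infrared wavelengths typical of two-photon microscopy:
d_two_photon = abbe_limit(800, 1.4)   # roughly 286 nm, a coarser limit
```

The second figure illustrates the closing remark above: at fixed numerical aperture, the longer the wavelength, the larger the smallest resolvable separation.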

I remember sitting in a neurology seminar last year; the topic was the imaging capabilities of STED (STimulated Emission Depletion) confocal microscopy, a superresolution technique, for living neurons. Immediately prior to the section where the method was described (a doughnut-shaped depletion beam forces nearby fluorophores back to the ground state by stimulated emission, preventing them from contributing to the intensity measurement at a given point) was a cartoon involving a police officer and some sort of criminal hijinks. The title of the slide was something along the lines of “How to break the law.”

As one audience member was quick to point out, the fluorescence emission (from the middle of the doughnut-shaped depletion PSF) must still travel back through the objective and all the other optics, convolving with the transfer functions of the lens elements at every step of the way. How can this be said to break the diffraction limit? I agree.

Because a confocal microscope is a scanning apparatus, it illuminates and records only a single point at a time. The illumination path has a point spread function just as the imaging path does, so normally a fairly large volume, capable of containing many fluorophores, is illuminated and contributes to the light brought to focus by the imaging path. By depleting a large volume surrounding the region of interest, the only fluorophores left capable of excitation by the non-toroidal excitation PSF are those in the small central volume. Although the PSF incident on the photodetector is still diffraction-limited, the only fluorophores contributing to the signal sit in that small, central, un-depleted volume.

I’m not convinced that this corresponds to superresolution, at least not without qualification. It certainly allows you to superlocalize the fluorophores, but resolution in microscopy means differentiating between two proximal light sources in space and, I think crucially, in time. So there is a time scale in STED, and in all scanning methods, over which two objects can be resolved, provided their movements are much slower than the scan. A highly dynamic process like cell membrane fluctuation would probably fall outside the processes that could be imaged by STED. I think a more precise term is warranted for this and similarly limited techniques.

I don’t make this claim simply because I’m catching my grumps early this year. I think the terminology is genuinely misleading. Consider the title of this paper: “Three-dimensional, single-molecule fluorescence imaging beyond the diffraction limit by using a double-helix point spread function” [1].
The authors demonstrate an interesting method for recovering depth in a widefield microscope, and they do pinpoint a dense population of fluorophores, some of which are only a few nanometres apart. But the fluorophores are effectively immobilized, and the entire imaging process takes 450 seconds.
“Resolution” and “sensitivity” are two very different things, and under certain constraints you can use one to inform the other. The title in this case misleads the reader into assuming that physical laws are being broken, which is not the case. In particular, this mild misdirection will lead undergraduates and laypeople to misinterpret the claims of science. We should make an effort to avoid it, even if a rival is pulling in grant money with these sorts of “impossible” claims. After all, I’m sure that Ernst Abbe wouldn’t buy it.

[1] S.R.P. Pavani, M.A. Thompson, J.S. Biteen, S.J. Lord, N. Liu, R.J. Twieg, R. Piestun, and W.E. Moerner. Three-dimensional, single-molecule fluorescence imaging beyond the diffraction limit by using a double-helix point spread function. Proc. Natl. Acad. Sci. U.S.A. 106(9) (2009).

Note: The 3D localization of dense fluorophores they report, again by multiple rounds of partial photoactivation and photobleaching, was generated over 30 cycles of 30 exposures, each frame consisting of a 500 ms exposure. That’s 450 seconds, or seven and a half minutes of acquisition time.
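The acquisition-time arithmetic in the note, spelled out:

```python
cycles = 30              # rounds of partial photoactivation/photobleaching
exposures_per_cycle = 30
exposure_s = 0.5         # 500 ms per frame

total_s = cycles * exposures_per_cycle * exposure_s
print(total_s, total_s / 60)   # 450.0 seconds, i.e. 7.5 minutes
```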