Structured illumination: the Bruce Wayne of the super-resolution league?


A week before Eric Betzig shared the Nobel Prize with Stefan Hell and William Moerner for super-resolution fluorescence microscopy, I listened to him give a talk at an imaging conference in Edinburgh, Scotland. The talk focused on Structured Illumination Microscopy (SIM). The idea that SIM does not belong in the same category as STED and super-localisation techniques, Betzig repeatedly stressed, is ludicrous. Betzig is so convinced of this notion that his group has moved to focus on developing applications of SIM for live imaging.

The best image resolution obtained by SIM is only about twice as good as that imposed by the normal diffraction limit, paling in comparison to the hundredfold improvement sometimes seen with STED, but SIM is faster and runs on a more efficient light budget than the rest of the super-resolution stable. These are non-trivial advantages when the subject is alive and would prefer to stay that way. Biologists can learn a lot from studying something that was formerly alive, but much more from cells in the dynamic travails of life.

If Betzig is convinced that SIM is more amenable to practical application than other super-resolution techniques, such as Photo-Activated Localisation Microscopy, the technique that won him the Nobel Prize, why was it left out when it came time for the Swedish Academy of Sciences to recognise super-resolution? The answer may lie more in the rules and peculiarities surrounding the awarding of a Nobel than in the scientific relevance and impact, but you wouldn’t guess that from reading the Scientific Background on the Nobel Prize in Chemistry 2014.

In the published view of the Kungliga Vetenskapsakademien, SIM, “Although stretching Abbe’s limit of resolution,” remains “confined by its prescriptions.” In other words, the enhancement beyond the diffraction limit achieved by structured illumination is just not super enough. In principle the resolution of STED can be improved without limit by switching your depletion laser from “stun” to “kill” (i.e. increasing the depletion intensity). Likewise, super-localisation is essentially a matter of taking a large number of images of blinking fluorescent tags; improving the effective resolution is a case of tuning the chemistry of your fluorescent molecules and taking an enormous number of images. In reality, practical problems prevent further resolution improvement long before either technique approaches the resolution of, say, a Heisenberg microscope. SIM, however, is subject to an “aliasing limit,” which, for the nonce, seems to be as hard and fast as Abbe’s and Rayleigh’s resolution criteria were (and, fluorescence techniques excepted, largely still are) for the last hundred years.
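That “aliasing limit” can be sketched in frequency space. Structured illumination works by a moiré effect: a sinusoidal excitation pattern of spatial frequency k_ill mixes with the sample’s fluorophore distribution, shifting frequencies that would otherwise be invisible down into the objective’s passband. A sketch of the standard argument:

```latex
% Widefield detection passes sample frequencies up to Abbe's cutoff:
%   |k| \le k_{\max} = 2\,\mathrm{NA}/\lambda .
% Illuminating with a pattern of frequency k_{\mathrm{ill}} shifts the
% sample spectrum, so frequencies up to k_{\max} + k_{\mathrm{ill}}
% become observable. But the pattern itself must pass through the same
% objective, so k_{\mathrm{ill}} \le k_{\max}, and linear SIM tops out
% at roughly double the conventional cutoff:
k_{\mathrm{SIM}} \;\le\; k_{\max} + k_{\mathrm{ill}} \;\le\; 2\,k_{\max}
```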

As a rule with only one exception I know about, a Nobel Prize is not awarded posthumously. Despite the justification proffered in the official background, Mats Gustafsson’s untimely death in 2011 may have played a major role in the exclusion of super-resolution structured illumination microscopy. Combined with the cap of three people sharing a single Prize, this left Rainer Heintzmann and the late Mats Gustafsson without Nobel recognition of their contributions to super-resolution. Even granting the somewhat arbitrary adjudication over what it means to truly “break” the diffraction limit, it seems curious that one of the super-res laureates has moved almost entirely away from the prize-winning technique he invented, preferring instead the under-appreciated SIM. The Nobel Prize is arguably the ultimate distinction in scientific endeavour, and it seems beneath the station of the prize for its issue to be governed so strictly by arbitrary statutes. Then again, the true reward of scientific achievement is not a piece of gold and a pile of kronor, but the achievement itself. The universe isn’t altered whether or not you win the Nobel Prize for uncovering one of its little secrets; the truth of the secret will remain regardless.

‘Anonymous’ has an intriguing comment about why super-resolution is still not finding common use in biology research here.

Interesting note: unlike STED and PALM/PAINT/STORM, structured illumination can be applied to quantitative phase imaging.

The original version of the bell .svg file is from:


How to win the Olympus Bioscapes photomicrography contest


All you need to win a $5,000 microscope is a $250,000 microscope

It is almost time to dust off your cover-image-quality photomicrographs and enter the Olympus Bioscapes microscopy contest. Judging by the techniques used by contest winners since the contest’s inauguration in 2004, the best way to better your chances is to use a confocal microscope. A side-effect of inventing a technique that wins a Nobel Prize is that eventually it becomes run-of-the-mill, and “conventional” widefield fluorescence also makes a good showing. Biophotonics purists will find plenty to like as well: transmitted light microscopy is well represented in a smattering of techniques including differential interference contrast, Zernike phase contrast, polarised light, Rheinberg illumination and Jamin-Lebedeff interference.


Confocal may be at the top of the heap at the moment, but transmitted light techniques continue to make strong appearances in stunning images among the top-ten places in Olympus Bioscapes.

In a promising development, computational imaging techniques are also finding success in the contest. The broad term “computational optics” covers techniques such as structured illumination, in which the patterns in several images (rather uninspiring on their own) are combined to give a computed image with resolution slightly better than that imposed by the physics of diffraction. Also in this category is light sheet microscopy, which creates nice images on its own (and has since the Ultramikroskop [pdf] of circa 1900), but is even better suited for combining many images to form a volume image. In my opinion, treating light as computable fields, equally amenable to processing in physical optics or electronics, is the enabling philosophy for the next deluge of discoveries to be made with biomicroscopy.

Compare the winningest techniques from the Olympus contest with those of the Nikon Small World contest below. Interestingly enough, confocal microscopy falls behind the simpler widefield fluorescence in the Nikon contest, and both have been bested throughout the history of the competition by polarised microscopy. Some of the differences in Olympus and Nikon contest winners may be due to the timing of technological breakthroughs. Bioscapes began in 2004, while Small World has been in operation since the late seventies. The vogue techniques and state-of-the-art have certainly evolved over the last four decades.

Nikon Small World Winners

Seeing at Billionths and Billionths

This was my (unsuccessful) entry into last year’s Wellcome Trust Science Writing Prize. It is very similar to the post I published on the 24th of June, taking a slightly different tack on the same theme.

The Greek word skopein underlies the etymology of a set of instruments that laid the foundations for our modern understanding of nature: microscopes. References to the word are recognizable across language barriers thanks to the pervasive influence of the ancient languages of scholars, and common usage gives us hints as to the meaning. We scope out a new situation, implying that we not only give a cursory glance but also take some measure or judgement.

Drops of glass held in brass enabled Robert Hooke and Antonie van Leeuwenhoek to make observations that would eventually give rise to germ theory. Light microscopy unveiled our friends and foes among the bacteria, replacing humours and miasmas as the supposed primary effectors of human health and disease. The concept of miasmatic disease, which holds that disease is caused by tainted air, is now so far-fetched that the term has been almost entirely lost to time. The bird-like masks worn by plague doctors were stuffed with potpourri: the thinking of the time was that fragrance alone could protect against the miasma of the Black Death. The idea seems silly to us now, thanks to the fruits of our inquiry. The cells described by Hooke and the “animalcules” seen by Leeuwenhoek marked a transition from a world operated by invisible forces to one in which the mechanisms of nature were vulnerable to human scrutiny. In short, science was born in the backs of our eyes.

The ability of an observer using an optical instrument to differentiate between two objects has, until recently, been limited by the tendency of waves to bend at boundaries, a phenomenon known as diffraction. The limiting effects of diffraction were formalised by German physicist Ernst Abbe in 1873. The same effect can be seen in water ripples bending around a pier.
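Abbe’s criterion is compact enough to state here: for emission wavelength λ collected by an objective of numerical aperture NA, the smallest resolvable separation d is roughly λ/(2 NA). As a worked example (assuming green emission at 520 nm and a high-end NA 1.4 oil-immersion objective):

```latex
d \;=\; \frac{\lambda}{2\,\mathrm{NA}}
\;=\; \frac{520\ \mathrm{nm}}{2 \times 1.4}
\;\approx\; 186\ \mathrm{nm}
```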

If the precision of optical components is tight enough to eliminate aberrations, and seeing conditions are good enough, imaging is “diffraction-limited.” With the advent of adaptive optics, dynamic mirrors and the like let observers remove aberrations from the sample as well as the optics. Originally developed to spy on dim satellites through a turbulent atmosphere, adaptive optics have recently been applied to microscopy to counteract the blurring effect of looking through tissue. If astronomy is like looking out from underwater, microscopy is like peering at the leaves at the bottom of a hot cuppa, complete with milk.

Even with the precise control afforded by adaptive optics, the best possible resolution is still governed by the diffraction limit, about half the wavelength of light. Before Leeuwenhoek’s time the microbial world was invisible; likewise, through the 20th century the molecular machinery underpinning the cellular processes of life has remained invisible, smeared by the diffraction limit into an irresolvable blur.

A human cell is typically on the order of ten microns in diameter. Its proteins, membranes, and DNA are organised at a scale a hundred to a thousand times smaller, in the tens and hundreds of nanometres. In a conventional microscope, information at this scale is not retrievable thanks to diffraction, yet it underlies all of life. Many of the mechanisms of disease operate at this level as well, and knowledge about how and why cells make mistakes has resounding implications for cancer and ageing. In the past few decades physicists and microscopists have developed a number of techniques to go beyond the diffraction limit and measure the nanometric technology that makes life.

A number of techniques have been developed to surpass the diffraction barrier. They vary widely, relying on some combination of engineered illumination and engineered fluorescent proteins to work. What they share is computation: the computer has become as important an optical component as a proper lens.

New instrumentation enables new measurements at the behest of human inquiry. Questions about biology at increasingly small spatial scales and in increasingly challenging imaging contexts generate the need for higher-precision techniques, in turn opening a floodgate of previously unobtainable data. New data lead to new questions, and the cycle continues until it abuts the fundamental laws of physical nature. Before bacteria were discovered, it was impossible to imagine their role in illness, and equally impossible to test it. Once the role was known, it became a simple intuitive leap for Alexander Fleming to suppose that the growth inhibition of bacteria by fungi he saw in the lab might be useful as medicine. With the ability to see at the level of tens of nanometres, another world of invisible forces has been opened to human consideration and innovation. Scientists have already leaped one barrier at the diffraction limit. With no fundamental limit to human curiosity, let us consider current super-resolution techniques as the first of many triumphs of detection past the limits of what is deemed possible.


Ever find yourself wishing for the last microscope you will ever need to buy, the instrument that can view anything at any scale and any speed? It’s very tempting to imagine an optical microscope with the diffraction-unlimited resolution of STED, the volumetric imaging speed of light sheet illumination, the deep-tissue penetration of multiphoton microscopy, and the ability to do it all in phase and scattering contrast without exogenous fluorophores or dyes. Perhaps the gamma-ray microscope from Heisenberg’s thought experiment or the tricorder from Star Trek would come close, but unfortunately we are still waiting on the underlying technologies for the latter to mature. In microscopy as in life, optimisation of one capability comes at a trade-off cost in another. Put more plainly, TANSTAAFL.

Earlier in the summer I attended a biophotonics summer school at the University of Illinois at Urbana-Champaign’s Beckman Institute (link). At the end of a combination of lab tours, seminar-style lectures and poster sessions, we were treated to an hour-long presentation by the president of Carl Zeiss, James Sharp. Perhaps you have heard of Zeiss, the company named for its founder, who teamed up with Ernst Abbe in the late 1800s to put microscope making on a scientific footing and commercialise it. After contrasting the present job market with that of days past (a story involving being stuck in his interviewer’s office for a day by a locked filing cabinet and errant bell-bottom trousers), Sharp went on to give what essentially amounted to a 45-minute advertisement for Zeiss (spoiler: they are not best friends with Leica) as a company to work for or buy things from. It was an insightful set of slides that emphasised how far I have to go in my own career before I could fathom spending half a million dollars on a microscope. The one insight that will stick with me for the foreseeable future is the imaging optimisation triangle.

Sharp described the triangle as a trade-off among resolution, speed, and depth, but the concept is fairly common, and the third trait is often defined instead by the signal-to-noise ratio, or sensitivity. The moral of the story is that all three corners of the triangle can’t be optimised simultaneously. All else being equal, STED can’t be as fast as widefield or light sheet imaging, and nothing can penetrate tissue like 4-photon imaging. Step changes in the underlying technology can raise the baseline of performance across microscope modalities, e.g. new sensor paradigms can improve signal-to-noise regardless of the technique used. However, even with marked leaps in innovation, you can’t have it all at once.

The microscopy triangle is typically invoked as a qualitative illustration of trade-offs. However, the three traits have measurable performance figures, and three corners map easily enough onto three axes. Why not populate a quantitative volume to show the pros and cons of various imaging modalities? Here are a few flagship microscope techniques placed on the quantitative microscopy TANSTAAFL pyramid.
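One minimal way to populate such axes is to normalise each modality’s rough specifications onto a common 0–1 scale, so that 1 always means “best on this axis.” A sketch of the idea; the numbers below are illustrative placeholders, not measured values:

```python
# Sketch: normalise rough (illustrative, NOT measured) specs for a few
# modalities onto common 0-1 axes for a TANSTAAFL-style comparison.

# (lateral resolution in nm [smaller is better], frames/s, depth in um)
specs = {
    "STED":        (50,    1,   50),
    "confocal":    (250,  10,  100),
    "light sheet": (300, 100,  500),
    "multiphoton": (350,   5, 1000),
}

def normalise(specs):
    """Map each axis onto 0-1, flipping resolution so 1 is always 'best'."""
    res, fps, depth = zip(*specs.values())
    scores = {}
    for name, (r, f, d) in specs.items():
        scores[name] = (
            min(res) / r,   # resolution: smallest (best) value -> 1
            f / max(fps),   # speed: fastest -> 1
            d / max(depth), # depth: deepest -> 1
        )
    return scores

scores = normalise(specs)
for name, (r, f, d) in sorted(scores.items()):
    print(f"{name:12s} resolution={r:.2f} speed={f:.2f} depth={d:.2f}")
```

Each technique then maxes out exactly one axis, which is the triangle’s point in numerical form: pick any modality and at least one of the other two scores drops well below 1.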


These are all vastly different techniques, so the minutiae of their strengths are somewhat lost. Given a known volume of desirable specifications for testing a hypothesis, the graph could be populated with the techniques at your disposal and used to inform a decision on which to utilise. More realistically, axes can be added as needed (e.g. photobleaching, axial resolution), and a single technique, or a set of similar ones, could be compared across different settings, e.g. laser power or sensor used, rather than comparing these vastly different modalities.

Values are approximate and from the following sources:
Multiphoton depth penetration estimated from a talk by Chris Xu of Cornell University
Wide-field: Personal estimates

The triangle is known by various names, including the imaging triangle, (somewhat ominously) the eternal triangle, or even the “triangle of frustration.”

Update 2014/07/09: Typo: “Start Trek” corrected to “Star Trek”

Good Seeing


Long the purview of telescopes, the dynamic mirrors and wavefront engineering that enable astronomers to calm the night sky’s twinkle are now finding applications in biological microscopy as well. The techniques, termed adaptive optics, are leading to major improvements in the clarity and imaging depth of today’s microscopes.

The eyes may or may not be the windows to the soul, but our ocular world plays a central role in how our minds are built. Of all human senses, sight is the most influential to our worldview, the most relied-upon to ascertain the validity of our guesses about reality. Galileo’s observations showed us our place in the cosmos by providing the evidence to test the conflicting ideas of Ptolemy and Aristotle against those of Copernicus and Kepler. Hooke, Leeuwenhoek, and Spinoza seeded the scientific landscape with observations that would provide an alternative to the commonly held belief that disease was “mal aria,” that is, a result of bad air, and preventable by applied fragrance. Even the word “cell,” the fundamental building block of life, was coined as Robert Hooke viewed the organisation of a slice of cork through the lens of a microscope.

Humans are lucky to live under a dense atmosphere, keeping us warm and respiring while it protects us from (most) meteors drawn to our gravity well. The downside is that Earthbound astronomers are like a swimmer watching a birthday party from the bottom of a pool, an assuredly poor choice of viewpoints. The dense, turbulent atmosphere of the Earth confounded observation of especially dim extraterrestrial objects, until a few decades ago when deformable mirrors were introduced to astronomical telescopes to counteract atmospheric aberration. Before adaptive optics, astronomers were limited in the tools available to counteract the atmosphere, and made do by building observation centres at high-altitudes and waiting for “good seeing” conditions to attenuate the blurring of active air. In contrast, a modern adaptive optics telescope can best the resolution of the famous Hubble space telescope, as in the case of the adaptive optics-enabled Magellan II located in Chile.
Astronomers have to look out through the thick soup of the atmosphere, but biological microscopists looking into tissues have even more challenges to contend with. Peering at living cells in vivo is like peering through a nice cup of milky chai, and it remains a major hurdle to determining the nature of life intact and in action. This has led to the adaptation of the same techniques previously developed to take the twinkle out of the night sky, now used to correct the blur of microscopic imaging at depth.

The problem of imaging through a highly aberrating medium is encountered twice when imaging into tissues: once on the illumination side of the path, and again as the signal leaves the sample. How much light reaches the desired depth and location, and how much of the resulting signal makes it back to a point detector or image sensor, together determine the clarity and speed with which an image can be formed.

In microscope design there are three characteristics to optimise: temporal resolution (speed), spatial resolution, and depth (signal to noise). Improving one aspect of an instrument invariably leads to a decrease in another quality. Antonie van Leeuwenhoek made incredible observations using instruments made by carefully melting pulled strands of glass in a flame and setting the resulting aspherical lenses in pinhole brass frames. The tiny, single-lens instruments bear more resemblance to a magnifying glass than to a modern compound microscope. Using one was a matter of holding the entire instrument a few centimetres from the eye and squinting through a tiny aperture at the subject, generally illuminated by sunlight. Game-changing innovations that improve the overall capability of microscopy beyond zero-sum trade-offs in design optimisation are few and far between. It is becoming increasingly clear as the technology matures that adaptive optics is fundamentally enabling for imaging tasks that were not possible with the instruments of a decade ago.

One realm in which adaptive optics is finding ready application is optical imaging of the living brains of model organisms. Brain imaging was also the inspiration that led Marvin Minsky to invent the confocal microscope in the late 1950s. In confocal microscopy, a pinhole plate rejects the majority of light arising from out-of-focus areas as the microscope beam is scanned throughout the sample of interest. The pinhole ensures that out-of-focus light is rejected, but there is a fundamental limit to how much total optical power can be pumped into tissue before damage occurs. Thanks to the pinhole, improving confocal imaging with adaptive optics is conceptually straightforward: any increase in the signal making it to the detector indicates a positive correction for aberrations, so optimising the dynamic elements of the microscope is a matter of producing the maximum signal.
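That signal-maximisation loop can be caricatured in a few lines. Here the “mirror” is just a vector of Zernike-like mode coefficients, and the detected signal is a toy function that peaks when a hidden aberration is cancelled; a real system would read the photodetector behind the pinhole instead of calling a model:

```python
import random

# Hidden sample aberration the deformable mirror should cancel
# (arbitrary units, three Zernike-like modes). Unknown in a real system.
true_aberration = [0.8, -0.5, 0.3]

def detected_signal(mirror):
    """Toy photodetector: signal peaks when the mirror cancels the aberration."""
    residual = sum((m + a) ** 2 for m, a in zip(mirror, true_aberration))
    return 1.0 / (1.0 + residual)

def optimise(n_iter=2000, step=0.05, seed=0):
    """Random-search hill climbing: keep any perturbation that boosts signal."""
    rng = random.Random(seed)
    mirror = [0.0, 0.0, 0.0]
    best = detected_signal(mirror)
    for _ in range(n_iter):
        trial = [m + rng.uniform(-step, step) for m in mirror]
        s = detected_signal(trial)
        if s > best:  # more light through the pinhole = better correction
            mirror, best = trial, s
    return mirror, best

mirror, signal = optimise()
print(mirror, signal)
```

Real adaptive-optics controllers use far more efficient search strategies (modal sweeps, wavefront sensing), but the principle is the same: the pinhole turns aberration correction into a one-number maximisation problem.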


Beam shaping and point-spread function engineering enable new imaging modalities in microscopy.

The application of adaptive optics to neuroimaging instills a strong sense of the cutting edge, combining brain science with optical physics, but some areas where adaptive optics has made a striking impact are much more domestic. Researchers at Durham University, UK employed adaptive optics confocal microscopy to measure the effects of temperature on the activity of cold-water lipases, enzymes that break down fat and grease. Enzyme names are one of the rare cases of scientific nomenclature being intuitive and informative: the name “lipase” identifies the enzyme’s substrate as lipids, or fats, and the suffix “-ase” denotes its activity of cutting them apart. Much like a greasy fingerprint on a pair of sunglasses, the very presence of the substrate induces unwanted blurring amenable to correction with dynamic optical elements.

The state of the art is no longer limited to improving the precision and design of static optics. Dynamic elements allow the microscope and the microscopist to adapt to specific imaging situations, a task for which algorithms and image processing are essential. The computational brains of modern microscopes are integral components of the optical system, as essential as the lenses and mirrors that make up the physical hardware.

In pushing the limits of the types of scientific questions that can be addressed with light, there’s no requirement to generate a two-dimensional image. Rather, the data required to test a given hypothesis may exist as a three-dimensional construct, a four-dimensional volume-plus-time series, or something of even higher dimensionality. Although visualisation of data will continue to be important for science communication, the central role of the image in science may soon take a back seat to generalised, multidimensional data. Paralleling this shift, the next generation of light microscopes will look radically different from our conventional expectations. The shift has already appeared in commercially available microscopes: the computer is so integral to the light sheet microscope made by German optics giant Zeiss that the instrument does not have eyepieces. The microscopes we use tomorrow will resemble today’s instruments to about the same extent that modern microscopes resemble Leeuwenhoek’s hand lenses.


In short order that shiny new confocal system may share the fate of this Leeuwenhoek replica. This piece is part of the collection at the Oxford Museum of the History of Science.

Adaptive optics links:

Edit 2014/06/24: Fixed links

How to win the Nikon Small World photomicrography competition

The deadline for the Nikon Small World photomicrography competition is fast approaching (April 30th), and I’ve parsed some data on what types of images have tended to win since the contest’s inception in the late 1970s. The graphs below include data from both the stills and the newly minted video competition.


Figure 1: The total number of images utilizing each technique for places 1-20, Honorable Mentions, and Images of Distinction.

Right away we see that polarized light techniques have a distinct advantage in terms of how often they appear on the winners’ podium. This was a bit of a surprise. I’m always left with the impression of a preponderance of confocal images after each year’s announcement of winners, but I suppose confocal would not have seen much use until the 80s.


Figure 2: Heat map of the total number of images from 1st to 20th place.

Polarized light still easily dominates the field, with fluorescence and confocal making strong showings (you’ll notice that many of the technique categories for NSW overlap). Techniques grouped under fluorescence do have a slightly higher number of 1st-place finishes (9 versus 8) and of total top-5 finishes (41 vs. 40), but beyond the top 5, polarized light has more placings at essentially every position.
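Tallies like the ones behind these figures are simple to reproduce once the winner lists are transcribed. A minimal sketch, assuming the records have been hand-entered as (year, place, technique) rows; the rows below are placeholders, not the actual Small World results:

```python
from collections import Counter

# Placeholder records: (year, place, technique). Illustrative only --
# NOT the real Nikon Small World winner data.
records = [
    (1978, 1, "polarized light"),
    (1985, 3, "polarized light"),
    (1992, 1, "fluorescence"),
    (2001, 2, "confocal"),
    (2010, 1, "fluorescence"),
    (2013, 5, "polarized light"),
]

def tally(records, max_place=20):
    """Count how often each technique appears among the top placings."""
    return Counter(tech for _, place, tech in records if place <= max_place)

counts = tally(records)
for tech, n in counts.most_common():
    print(f"{tech}: {n}")
```

Restricting `max_place` to 5 or 1 gives the top-5 and 1st-place breakdowns directly, which is all the heat map in Figure 2 amounts to, repeated per place.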

Good luck to everyone who enters. I don’t have the rights to display my favorites from previous contests (e.g. this, this, or this), but I will display a few of my own, non-winner, images.


Freshwater ostracod


Freshwater copepod (cyclops)