Philaephilia?

Philaephilia n. Temporary obsession with logistically important and risky stage of scientific endeavour and cometary rendezvous.

Don’t worry, the condition is entirely transient

Rivalling the 7 minutes of terror as NASA’s Curiosity rover entered the Martian atmosphere, Philae’s descent onto comet 67P/Churyumov-Gerasimenko Wednesday as part of the European Space Agency’s Rosetta mission had the world excited about space again.

Comets don't have the classic appeal of planets like Mars. The high visibility of Mars missions and moon shots has roots in visions of a Mars covered in seasonal vegetation and full of sexy humans dressed in scraps of leather, and little else. But comets may be much better targets in terms of the scientific benefits. Comets are thought to have added water to early Earth, after the young sun had blasted the substance out to the far reaches of the solar system, beyond the realm of the rocky planets. Of course, comets are also of interest for pure novelty: until Philae, humans had never put a machine down on a comet gently. Now the feat has been accomplished three times, albeit a bit awkwardly, with all science instruments surviving two slow bounces and an unplanned landing site. It is unfortunate that Philae is limited to only 1.5 hours of sunlight per 12-hour comet day, but there is some possibility that a last-minute attitude adjustment may have arranged the solar panels a bit more favourably.
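(A quick sanity check on those numbers, taking the quoted 12-hour figure as the full comet day:

    \frac{1.5\ \mathrm{h}}{12\ \mathrm{h}} = 0.125

which is where the 12.5% figure below comes from.)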

So if Rosetta’s Philae lander bounced twice, rather than grappling the surface as intended, and landed in a wayward orientation where its solar panels are limited to only 12.5% of nominal sun exposure, how is the mission considered a success?

Most likely, the full significance of the data relayed from Philae via Rosetta will take several months of analysis to uncover. Perhaps some of the experiments will be wholly inconclusive and merely observational, neither confirming nor refuting hypotheses about the characteristic structure of comets. For example, it seems unlikely that the MUPUS instrument (i.e. the cosmic drill) managed to penetrate a meaningful distance into the comet, so we probably won't gain much insight into the top layers beyond perhaps a centimetre or so. In contrast, CONSERT may yield unprecedented observations about the interior makeup of a comet.

In science, failures and negative findings are certainly more conclusive than, and arguably preferable to, so-called positive results, despite the selective pressure for the latter in science careers and the lay press. An exception disproves the rule, but a finding in agreement with theory merely "fails to negate" said theory. For example, we now know better than to use nitrocellulose as a vacuum propellant. Lesson learned on that front.

In addition to a something-divided-by-nothing fold increase in knowledge about the specific scenario of attempting a soft landing on a comet, I'd suggest we now know a bit more about the value of autonomy in expeditions where the signal delay between mission control and operations precludes real-time feedback. Perhaps if Philae had been optimised for adaptability, it would have been able to maintain its orientation to the comet surface and give Rosetta and scientists at home a better idea of its (final) resting place after detecting that the touchdown and grapple didn't go through. Space science is necessarily cautious, but adaptive neural networks and other alternative avenues may prove useful in future missions.

I'll eagerly await the aftermath, when the experimental and telemetry data have been further analysed. The kind of space mission where the landing sequence can omit a major step and still deliver operational success for all scientific instruments on board is the kind of mission that space agencies should focus on. The Rosetta/Philae mission combined key elements of novelty (first soft landing on and persistent orbiting of a comet), low cost (comparable to a few space shuttle missions), and robustness (the grapples didn't fire, the lander bounced and got lost, and science still occurred). Perhaps we'll see continued ventures from international space agencies into novel, science-driven expeditions. Remember, the first scientist on the moon was on the (so far) final manned mission to Luna. Missions in the style of Rosetta may be more effective and valuable on all three of the above points, and are definitely more fundamental in terms of the science achieved, than continuous returns to Mars and pushes for manned missions.

In a perfect world where space agencies operate in a non-zero-sum funding situation alongside all the other major challenges faced by human society, we would pursue them all. But realistically, Philae has shown that alternative missions not only offer more for us to learn in terms of science and engineering, but can also enrapture the population in a transcendent endeavour. Don't stop following the clever madness of humans pursuing their fundamental nature of exploring the universe they live in.

The advantages of parametric design

I work primarily in OpenSCAD when making designs for 3D printing (and 2D designs for lasercutting). This means that instead of a WYSIWYG interface driven primarily by the mouse, my designs are all scripted in a programming language that looks a lot like C. This might seem more difficult at first (and it is certainly less than ideal for some situations), but it makes for a pretty simple way to generate repetitive structural elements with basic flow control, i.e. for loops. Even more importantly, it means that I can substantially change a design by modifying the variable values passed to a function (called a module in OpenSCAD).

For the sake of an example, take Lieberkühn reflectors for macrophotography. Lieberkühn reflectors are a classic illumination technique that has mostly fallen out of style in favour of more modern illumination such as LED or fibre-based lighting, but they remain quite elegant and offer a few unique advantages. I have been working with these in conjunction with a few different lenses, and mostly with the help of a macro bellows. The bellows allows variable working distances as well as magnifications, so the focus of any one Lieberkühn will only be effective within a narrow range of macro-bellows lengths. Parametric designs such as the ones I create and work with in OpenSCAD allow me to change attributes such as the nominal working distance without starting each design from scratch. For example:

LRWD35: 35mm Lieberkühn focus
LRWD30: 30mm Lieberkühn focus
LRWD25: 25mm Lieberkühn focus
LRWD20: 20mm Lieberkühn focus
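For those unfamiliar with the workflow, here is a minimal sketch of what such a parametric module can look like. To be clear, this is an illustrative toy rather than the actual LRWD parts: the module name, the simple parabolic cross-section, and every parameter (working_distance, lens_bore, outer_r, thickness) are placeholders invented for the example.

    // Toy parametric Lieberkühn-style reflector: a paraboloid shell whose
    // focal length equals the desired working distance. Names and
    // dimensions are illustrative, not the published LRWD designs.
    module lieberkuhn(working_distance = 35, lens_bore = 20,
                      outer_r = 35, thickness = 2) {
        f = working_distance;  // parabola z = r^2/(4f) focuses parallel light at z = f
        inner = [for (r = [lens_bore/2 : 1 : outer_r]) [r, r*r/(4*f)]];
        n = len(inner);
        outer = [for (i = [0 : n-1]) inner[n-1-i] + [0, thickness]];  // offset shell, reversed
        rotate_extrude($fn = 120)           // spin the 2D cross-section about the Z axis
            polygon(concat(inner, outer));
    }

    lieberkuhn(working_distance = 25);      // an "LRWD25"-style variant is one changed argument

Regenerating a variant is then a one-argument change, which is the property that makes iterating on fit so painless.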

This approach has proven very useful for me, both for creating highly customisable designs and for iterating to get the fit just right. I'll post the results of my latest exploration of Lieberkühn reflectors soon after I receive the latest realisation in Shapeways bronzed steel.

Have we really lost 52% of the world’s animals?

The methods used by the LPI should not be accepted without reservation

Turning a critical eye on the 2014 Living Planet Report.

WWF's Living Planet Report (LPR) 2014 has been making headlines because of its alarming claim that population sizes of mammals, birds, reptiles, amphibians and fish have dropped by half since 1970. The report reached this stark (and widely shared) conclusion via the Living Planet Index (LPI), a "measure of the state of the world's biological diversity based on population trends of vertebrate species from terrestrial, freshwater and marine habitats" developed by scientists at WWF and the Zoological Society of London (ZSL). The LPI was adopted by the Convention on Biological Diversity (CBD) as a progress indicator for its 2020 goal to "take effective and urgent action to halt the loss of biodiversity", which sadly (but unsurprisingly) appears to be failing.

In the previous edition of the LPR published two years ago, the drop in vertebrate numbers was estimated to be 30%. Now the scientists behind the LPI claim to have improved the method, resulting in a much greater decrease (52%) than previously reported. But the methodology is still highly controversial.

The team estimated trends in 10,380 populations of 3,038 mammal, bird, reptile, amphibian and fish species using 2,337 data sources including published scientific literature, online databases, and grey literature. The data used in constructing the index are time series of either population size, density, abundance or a “proxy of abundance”, e.g. bird nest density when there were no bird counts available.

The collection and analysis of these data represent an enormous amount of work, and the team responsible deserves praise for undertaking this huge project and for creating an urgent call to action for wildlife conservation. However, we need to bear in mind that this dramatic "halving" of the world's vertebrates is a grotesque oversimplification of biodiversity loss. The diversity of data sources and types used, the variability in data quality, and the uncertainty behind many of the population trend estimates mean that the LPI is probably not very reliable.

Additionally, the 3,038 species included in the analyses represent only 4.8% of the world's 62,839 described vertebrate species. (The report entirely omits invertebrates, which are often keystone species and vastly outnumber all vertebrate animals.) Following criticism of the methodology of previous LPIs, this year the LPI team used the estimated number of species in different taxonomic groups and biogeographic areas to apply weightings to the data. This means that the population trend of a particular taxonomic group becomes more important if the group comprises a large number of species, whereas the population trend of a species-poor taxon is allocated considerably less weight.

To illustrate this, let us consider fishes, which in the LPI analysis represent the largest proportion of vertebrate species in almost all biogeographic areas and therefore carry the most weight. My guess is that the fish species whose population trends are sufficiently well documented to be included in the analysis are most often in serious decline, because well-studied species are usually those that are either overharvested or frequent victims of bycatch. Therefore, the negative fish trend contributed more to the final 52% figure than the decline of any other taxonomic group. Ironically, by trying to decrease error from taxonomic bias in the available data, this method allows well-known species to drive the overall trend and does not deal with the problem of underrepresentation of less-studied species. Many of these less visible species, outside of human interest as food or pests, contribute substantially to overall biodiversity and ecosystem function.
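Schematically (a simplified sketch, not the exact formulation in the report's technical supplement), the index chains together average year-on-year population trends, with the across-group average weighted by how species-rich each group is estimated to be:

    \bar{d}_t = \sum_g w_g \, \bar{d}_{g,t}, \qquad w_g \propto \text{described species richness of group } g, \qquad I_t = I_{t-1} \cdot 10^{\bar{d}_t}

where \bar{d}_{g,t} is the mean log10 rate of change of the monitored populations in group g from year t-1 to year t. Because fishes carry the largest w_g in most realms, a strongly negative fish trend drags the whole index down, which is exactly the concern outlined above.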

Should we believe the shocking headlines? Have we really killed “half of the world’s animals”? Probably not. Conservationists hope that this type of dramatic statement will inspire action but the severity of the claim risks desensitising the public, achieving the opposite of its intended effect. Developing a clear picture of the degree of the threats humans pose to biodiversity is difficult, but imperfect knowledge is no excuse for negligence. We know for certain that we are driving species to extinction at an alarming rate and that this will have serious implications for the environment, economies, and human health. Is this knowledge really not sufficient to motivate urgent and meaningful conservation action?

Olivia Nater is a conservationist and biologist who is particularly fond of bees. Twitter @beeologist

How to win the Olympus Bioscapes photomicrography contest


All you need to win a $5,000 microscope is a $250,000 microscope

It is almost time to dust off your cover-image-quality photomicrographs and enter the Olympus Bioscapes microscopy contest. Judging by the techniques used by contest winners since the contest's inauguration in 2004, the best way to better your chances is to use a confocal microscope. A side effect of inventing a technique that wins a Nobel Prize is that eventually it becomes run-of-the-mill, and "conventional" widefield fluorescence also makes a good showing. Biophotonics purists will find plenty to like as well: transmitted light microscopy is well represented in a smattering of techniques including differential interference contrast, Zernike phase contrast, polarised light, Rheinberg illumination and Jamin-Lebedeff interference.


Confocal may be at the top of the heap at the moment, but transmitted light techniques continue to make strong appearances in stunning images among the top ten places in Olympus Bioscapes.

In a promising development, computational imaging techniques are also finding success in the contest. The broadly termed "computational optics" includes techniques such as structured illumination, in which the patterns in several images (rather uninspiring on their own) are combined to give a computed image with resolution up to about twice as fine as the diffraction limit would otherwise allow. Also in this category is light sheet microscopy, which creates nice images on its own (and has since the Ultramikroskop [pdf] of the early 1900s), but is even better suited to combining many images to form a volume image. In my opinion, treating light as computable fields, equally amenable to processing in physical optics or electronics, is the enabling philosophy for the next deluge of discoveries to be made with biomicroscopy.
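For the curious, the structured illumination trick is easiest to state in frequency space (a schematic statement that ignores the fluorescence-specific details): illuminating the sample with a pattern of spatial frequency k_ill aliases otherwise-invisible sample detail into the microscope's passband as moiré fringes,

    k_{\text{observed}} = k_{\text{sample}} \pm k_{\text{ill}}

so with the illumination pattern itself at the diffraction limit k_0, the reconstruction recovers sample information out to roughly 2k_0, the factor-of-two gain of linear SIM.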

Compare the winningest techniques from the Olympus contest with those of the Nikon Small World contest below. Interestingly enough, confocal microscopy falls behind the simpler widefield fluorescence in the Nikon contest, and both have been bested throughout the history of the competition by polarised light microscopy. Some of the differences between the Olympus and Nikon contest winners may be due to the timing of technological breakthroughs. Bioscapes began in 2004, while Small World has been in operation since the late seventies. The vogue techniques and the state of the art have certainly evolved over the last four decades.

Nikon Small World Winners

Seeing at Billionths and Billionths

This was my (unsuccessful) entry into last year's Wellcome Trust Science Writing Prize. It is very similar to the post I published on the 24th of June, taking a slightly different tack on the same theme.

The Greek word skopein underlies the etymology of a set of instruments that laid the foundations for our modern understanding of nature: microscopes. References to the word are recognisable across language barriers thanks to the pervasive influence of the ancient languages of scholarship, and common usage gives us hints as to the meaning. We scope out a new situation, implying that we not only give a cursory glance but also take some measure or judgement.

Drops of glass held in brass enabled Robert Hooke and Antonie van Leeuwenhoek to make observations that gave rise to germ theory. Light microscopy unveiled our friends and foes among bacteria, replacing humours and miasmas as the primary effectors driving human health and disease. The concept of miasmatic disease, the notion that illness is caused by tainted air, now seems so far-fetched that the term has been almost entirely lost to time. The bird-like masks worn by plague doctors were stuffed with potpourri: the thinking of the time was that fragrance alone could protect against the miasma of the Black Death. The idea seems silly to us now, thanks to the fruits of our inquiry. The cells described by Hooke and the "animalcules" seen by Leeuwenhoek marked a transition from a world operated by invisible forces to one in which the mechanisms of nature were vulnerable to human scrutiny. In short, science was born in the backs of our eyes.

The ability of an observer using an optical instrument to differentiate between two objects has, until recently, been limited by the tendency of waves to bend at boundaries, a phenomenon known as diffraction. The limiting effects of diffraction were formalised by German physicist Ernst Abbe in 1873. The same effect can be seen in water ripples bending around a pier.
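In its standard textbook form (quoted here from memory rather than from the 1873 paper), Abbe's limit on the smallest resolvable separation is

    d = \frac{\lambda}{2 n \sin\theta} = \frac{\lambda}{2\,\mathrm{NA}}

so for green light at λ ≈ 550 nm and a good oil-immersion objective with NA ≈ 1.4, d comes out to roughly 200 nm, the "half the wavelength" figure that appears again below.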

If the precision of optical components is tight enough to eliminate aberrations, and seeing conditions are good enough, imaging is “diffraction-limited.” With the advent of adaptive optics, dynamic mirrors and the like let observers remove aberrations from the sample as well as the optics. Originally developed to spy on dim satellites through a turbulent atmosphere, adaptive optics have recently been applied to microscopy to counteract the blurring effect of looking through tissue. If astronomy is like looking out from underwater, microscopy is like peering at the leaves at the bottom of a hot cuppa, complete with milk.

Even with the precise control afforded by adaptive optics, the best possible resolution is still governed by the diffraction limit, about half the wavelength of light. Before Leeuwenhoek's time the microbial world was invisible. Likewise, through the 20th century the molecular machinery underpinning the cellular processes of life remained invisible, smeared by the diffraction limit into an irresolvable blur.

A human cell is typically on the order of ten microns in diameter. Its proteins, membranes, and DNA are organised at a level between one-hundredth and one-thousandth of that size, in the tens and hundreds of nanometres. In a conventional microscope, information at this scale is not retrievable thanks to diffraction, but it underlies all of life. Many of the mechanisms of disease operate at this level as well, and knowledge about how and why cells make mistakes has resounding implications for cancer and ageing. In the past few decades, physicists and microscopists have developed a number of techniques that go beyond the diffraction limit to measure the nanometric technology that makes life.

A number of techniques have been developed to surpass the diffraction barrier. They vary widely in the details, but most rely on some combination of engineered illumination and engineered fluorescent proteins. The thing they have in common is computation: the computer has become as important an optical component as a proper lens.

New instrumentation enables new measurements in the service of human inquiry. Questions about biology at increasingly small spatial scales and under increasingly challenging imaging conditions generate the need for higher precision techniques, in turn opening a floodgate of previously unobtainable data. New data lead to new questions, and the cycle continues until it abuts the fundamental laws of physical nature. Before bacteria were discovered, it was impossible to imagine their role in illness and equally impossible to test it. Once the role was known, it became a simple intuitive leap for Alexander Fleming to suppose that the growth inhibition of bacteria by fungi he saw in the lab might be useful as medicine. With the ability to see at the level of tens of nanometres, another world of invisible forces has been opened to human consideration and innovation. Scientists have already leaped one barrier at the diffraction limit. With no fundamental limit to human curiosity, let us consider current super-resolution techniques as the first of many triumphs of detection past the limits of what is deemed possible.

Rubbish in, Garbage Out?

Extraordinary claims require extraordinary press releases?

You have probably read a headline in the past few weeks stating that NASA has verified that an infamous, seemingly reactionless propulsion drive does in fact produce force. You probably have not read the technical report that spurred the media frenzy (relative to the amount of press coverage normally allocated to space propulsion research, anyway), relying instead on the media reports and the expert opinions those outlets solicited. The twist is that no one else seems to have read it either, excepting perhaps the participants of the conference at which it was presented; that includes myself and likely the authors of almost any other material you will find commenting on it. The reason is that the associated entry in the NASA Technical Reports Server consists only of an abstract.

The current upswing of interest and associated speculation on the matter of this strange drive is eerily reminiscent of other recent \begin{sarcasm}groundbreaking discoveries\end{sarcasm}: FTL neutrinos measured by the OPERA experiment and the Arsenic Life bacterium from Mono Lake, California. Both were later refuted; some important people at OPERA ended up resigning, while the Arsenic Life paper continues to boost the impact factors of its authors and publisher as Science Magazine refuses to retract it (current citations according to Google Scholar number more than 300).

I would venture that the OPERA findings were disclosed more responsibly than the Arsenic Life results. Although both research teams made use of press releases to gain a broad audience for their findings (note this down in your lab notebook as "do not do" if you are a researcher), the OPERA findings were at the pre-publication stage and disclosed as an invitation to greater scrutiny of their instrumentation, while the Arsenic Life strategy was much less reserved. From the OPERA press release:

The OPERA measurement is at odds with well-established laws of nature, though science frequently progresses by overthrowing the established paradigms. For this reason, many searches have been made for deviations from Einstein’s theory of relativity, so far not finding any such evidence. The strong constraints arising from these observations makes an interpretation of the OPERA measurement in terms of modification of Einstein’s theory unlikely, and give further strong reason to seek new independent measurements.

Notice the description of the search for exceptions to Einstein's relativity as "so far not finding any such evidence", despite the fact that the data being reported would constitute exactly such evidence if anomalous instrumentation could be ruled out. This was a plea for help, not a claim of triumph.

By contrast, the press seminar accompanying the release of Felisa Wolfe-Simon et al.'s "A bacterium that can grow by using arsenic instead of phosphorus" issued no such caveats with its claims. Likewise, it was readily apparent from the methods section of the paper that the Arsenic Life team made no strong effort to refute their own data (the principal aim of experimentation), and the review process at Science should probably have been more rigorous than standard practice. It is perhaps repeated too often without consideration, but I'll mention the late, great Carl Sagan's assertion that "extraordinary claims require extraordinary evidence." The OPERA team kept this in mind, while the Arsenic Life paper showed a strong preference for sweeping any due diligence in considering alternative explanations under the carpet. Ultimately, the OPERA results were explained as an instrumentation error, and the Arsenic Life discovery has been refuted in several independent follow-up experiments (e.g. [1][2]).

Is propellant-less propulsion on par with Arsenic Life or the FTL neutrinos in terms of how the findings were communicated? In this case I would lean toward the latter: more of a search for instrumentation error than a claim of the discovery of Totally New Physics. The title of the tech report, "Anomalous Thrust Production from an RF Test Device Measured on a Low-Thrust Torsion Pendulum", conveys the minimum requisite dose of skepticism.

Background reading below, but by far the best take on the subject is xkcd number 1404. The alt-text: “I don’t understand the things you do, and you may therefore represent an interaction with the quantum vacuum virtual plasma.”

23/08/2014 several typos corrected
[UPDATE: Full version of tech report: http://rghost.net/57230791] via comments from http://ow.ly/ADJqb .
http://www.wired.co.uk/news/archive/2014-07/31/nasa-validates-impossible-space-drive .
http://www.wired.co.uk/news/archive/2014-08/07/10-qs-about-nasa-impossible-drive .
http://www.wired.com/2014/08/why-nasas-physics-defying-space-engine-is-probably-bogus/?mbid=social_twitter .
http://en.wikipedia.org/wiki/Quantum_vacuum_plasma_thruster .
http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20140006052.pdf .
http://www.wired.co.uk/news/archive/2013-02/06/emdrive-and-cold-fusion .
http://www.aiaa.org/EventDetail.aspx?id=18582 .

Much ado about sitting

A few years ago, athletic shoe companies began to cash in on a study or two suggesting that running in shoes was dangerous, guaranteed to ruin your joints and your life, make you less attractive and confident, etc. (at least, that’s how it was translated to press coverage). The only viable answer, vested marketing implied, was to buy a new pair of shoes with less shoe in them.

Despite the obvious irony, consumers flocked in droves to purchase sweet new kicks and rectify their embarrassing running habits. As with any other fitness craze, popular active-lifestyle magazines ran articles about the trend, spinning a small amount of scientific research into definitive conclusions, right next to advertisements for the shoes themselves. Fast forward to 2014, wherein the makers of arguably the most notorious shoes in the minimalist sector, the Vibram Five Fingers line, have moved to settle a lawsuit alleging that the claimed health benefits of the shoes were not based on evidence. The market frenzy for minimalist footwear appears to have sharply abated. There are even blatant examples of market backlash in the introduction of what could be described as "marshmallow shoes," such as the Hoka line, with even more padding than runners were used to before the barefoot revolution.

An eerily similar phenomenon, market capitalisation on nascent scientific evidence, has appeared around the latest demon threatening our health: sitting. Hence the orogenic (rapidly rising) marketplace for accessories designed to get workers a bit less semi-recumbent in the workplace. This market was virtually non-existent only a few years ago, yet it is now substantial enough to have spawned an entire genre of internet article.

There is even a new term gaining traction for the condition: “sitting disease.” I sure hope it’s not catching. For now at least the term seems to remain quarantined in quotation marks most places it is used.

Many of the underlying articles in science journals are what is euphemistically referred to as survey science. Long generation time, a lack of uniform cultivation standards and ethical considerations make Homo sapiens a rather poor model organism. Even if survey data were considered reliable (a dubious assumption), they only reveal associations. Even accelerometer studies, like those at the Mayo Clinic, only measure activity for a few weeks. The results can't tell you that sitting alone causes obesity. An equally fair hypothesis would be that obesity increases the likelihood of staying seated, but that's just called inertia.

Although the studies and their press coverage motivate a burgeoning marketplace for NEAT accessories, they don't actually tell us much in the way of new information. A sedentary lifestyle is unhealthy. Attempts to increase the amount of low-intensity activity throughout the day, such as using a walking desk, are likely to stimulate appetite. Without considering diet (and downplaying the importance of exercise), a standing desk, sitting ball, or occasional walking meeting is not likely to have tremendous health benefits on its own. And despite the rhetoric, maintaining a smoking habit to break up your sit-time with walks to the outdoors is probably not an equivalent trade-off. Presenting health management in such an unbalanced, single-variable way seems motivated more by trendiness for some, revenue for others, and both for the press. It is not that sitting is actually good for you; it's just myopic to focus solely on that one health factor. As part of a sedentary lifestyle gestalt, yes, it does play a role in promoting ill health. Then again, if you think about it, you probably already knew that before it was cool.


Avoid sensationalist science journalism; consider the sources:
Ford, E.S., and Caspersen, C.J. (2012). Sedentary behaviour and cardiovascular disease: a review of prospective studies. Int J Epidemiol 41, 1338–1353.
Hamilton, M.T., Hamilton, D.G., and Zderic, T.W. (2007). Role of low energy expenditure and sitting in obesity, metabolic syndrome, type 2 diabetes, and cardiovascular disease. Diabetes 56, 2655–2667.
Katzmarzyk, P.T., Church, T.S., Craig, C.L., and Bouchard, C. (2009). Sitting time and mortality from all causes, cardiovascular disease, and cancer. Med Sci Sports Exerc 41, 998–1005.
Rosenkranz, R.R., Duncan, M.J., Rosenkranz, S.K., and Kolt, G.S. (2013). Active lifestyles related to excellent self-rated health and quality of life: cross sectional findings from 194,545 participants in The 45 and Up Study. BMC Public Health 13, 1071.
Rovniak, L.S., Denlinger, L., Duveneck, E., Sciamanna, C.N., Kong, L., Freivalds, A., and Ray, C.A. (2014). Feasibility of using a compact elliptical device to increase energy expenditure during sedentary activities. Journal of Science and Medicine in Sport 17, 376–380.
Schmid, D., and Leitzmann, M.F. (2014). Television Viewing and Time Spent Sedentary in Relation to Cancer Risk: A Meta-analysis. JNCI J Natl Cancer Inst 106, dju098.
Young, D.R., Reynolds, K., Sidell, M., Brar, S., Ghai, N.R., Sternfeld, B., Jacobsen, S.J., Slezak, J.M., Caan, B., and Quinn, V.P. (2014). Effects of Physical Activity and Sedentary Time on the Risk of Heart Failure. Circ Heart Fail 7, 21–27.

“Where is everybody?”


Don’t get too excited about finding E.T. just yet. Get excited about the engineering.

A few days ago NASA held a press conference moderated by NASA Chief Scientist Ellen Stofan. The filtered headline that eventually made its way into the popular consciousness of the internet is that the discovery of extraterrestrial life is a paltry couple of decades away. The ways the conference was parsed into news form ranged from the relatively guarded "NASA scientists say they're closer than ever to finding life beyond Earth" at the LA Times to the more sensational "NASA: ALIENS and NEW EARTHS will be ours inside 20 years" at The Register. As statements go, the former headline is almost unavoidably true given the assumption that humans eventually stumble upon life off-planet, and the latter is only one more over-capitalised word away from being wholly fantastic. Neither actually touches on the content of the NASA press conference.

The conference was partially prompted by the April announcement of the Kepler program's discovery of the Earth-similar Kepler-186f, which happens to reside in the habitable zone of its eponymous parent star. Although Kepler-186f definitely might be sort of a bit more Earth-like, its discovery was only the latest in a long list of over 1,800 exoplanets posited to exist to date. Although the primary technique for exoplanet discovery, stellar dimming attributable to planetary transits, is not infallible [paywalled primary source], the continued refinement of modern signal processing for unearthing (heh) exoplanet signatures makes this an exciting time to look skyward.
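The transit signal itself is a simple geometric ratio, at least in the idealised case that ignores limb darkening and grazing geometries:

    \frac{\Delta F}{F} \approx \left( \frac{R_p}{R_\star} \right)^2

For an Earth-sized planet crossing a Sun-like star that works out to roughly (6,400 km / 700,000 km)^2 ≈ 8 × 10^-5, a dimming of less than 0.01%, which is why the refined signal processing mentioned above matters so much.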

The speakers took a broad view of progress toward answering the question "are we alone?" John Grunsfeld, Hubble mechanic extraordinaire, emphasised the approach of looking for spectral signals corresponding to bio-signatures with the upcoming James Webb telescope. Of course, the terracentric focus shared by the panel means that NASA plans to look for signals associated with Earth life: water, methane, oxygen, etc. Carl Sagan et al. once considered the task of finding similar biosignatures on Earth itself. Looking for signs we know to be associated with our own personal experience of life is our best current guess for what we should be looking for, but there is no guarantee that it is the right one. We are no longer too enthralled by the idea of trading arsenate for phosphate, but our own planet has plenty of examples of strange metabolism, so we should expect life off-planet to encompass even more peculiar possibilities. Imagine our chagrin if we spend a few centuries looking for spectral signatures of water before stumbling across hydrophobic biochemistry on Titan.

Many of us may remember the nanobe-laden Martian meteorite ALH84001 that touched off a burst of interest and a flurry of Mars probes in the latter half of the 1990s. Like the 100-200 nm fossilised "bacteria" in the Mars meteorite, the tone suggesting imminent discovery of extraterrestrial life (particularly in the sensationalist coverage by the lay press) serves as little more than hyperbolic rhetoric. If this effect carries over to those with a hand on the purse-strings, so much the better, but don't get too caught up as a member of the scientifically literate and generally curious public. The likelihood of finding life outside our own planet in a given time span is essentially impossible to predict with no priors, hence Fermi's famous paradox, which graces the title of this post. The actual content of the video is much more important than the wanton speculation that fuels its press coverage.

A major advantage of placing the Hubble space telescope above the atmosphere was to avoid the optical aberrations generated by atmospheric turbulence. The present state of the art in adaptive optics and signal processing essentially obviates this need, as ground-based telescopes such as Magellan II in Chile can now outperform Hubble in terms of resolution. The James Webb will offer some fundamentally novel capabilities in what it can see, with a 6.5 m primary mirror and sensors covering wavelengths from 600-nanometre red light to the mid-infrared at 28 microns.

The upcoming TESS survey, described by MacArthur Fellow Sara Seager, will use the same basic technique as the Kepler mission, observing planetary transits, to look for exoplanets. TESS will launch in 2017, slightly in advance of the main attraction of JWST. Looking for planetary transits has served us well in the past, but direct imaging is the holy grail. Seager described a starshade for occluding bright, planet-hosting stars to further that goal as part of the New Worlds mission. The design resembles a sunflower rather than a circular shade; the latter would introduce Airy rings from diffraction around its edges. Desert tests of the prototypes have been encouraging so far. The precision engineering of the shade's unfolding is another masterpiece: due to its size, deployment cannot be tested in a terrestrial vacuum chamber, so the engineering has to be all the more precise. I could see scale versions of the design doing quite well as parasols in the gift shop.

Artist's concept of the New Worlds Observatory

Image from NASA via Wikipedia

The natural philosophy that we now call science has roots in the same fundamental questions as "regular" philosophy. "Are we alone?" is really just a proxy for "Where are we, how does it work, and why are we here?" Without any definitive answers to these questions on the horizon, I think we can safely say that building the machines that allow us to explore them, and conditioning our minds to think about our universe, is a pretty good way to spend our time. It will be a lonely universe if we find ourselves to be a truly unique example of biogenesis, but not so lonely in the looking.

As for yours truly, I’m looking forward to the “Two Months of Terror” (to quote Grunsfeld), October-December 2018, as the James Webb telescope makes its way to the L2 Lagrange point to unfold and cool in preparation for a working life of precipitous discovery.

Link to video

Panel:
Ellen Stofan- Chief Scientist, NASA
John Grunsfeld- Astrophysicist, former astronaut, Hubble mechanic
Matt Mountain- Director, Space Telescope Science Institute
John Mather- Project Scientist, James Webb telescope; 2006 Physics Nobel laureate
Sara Seager- Astrophysicist, MIT Principal Investigator, MacArthur Fellow 2013
Dave Gallagher- Electrical Engineer, Director of Astronomy and Physics at the Jet Propulsion Laboratory

Also read up on ESA projects: the Herschel Space Observatory, observing at 60 to 500 microns, and Gaia, a satellite set to use parallax to generate a precise galactic census.

Top image by the author