A Skeptic Over Coffee – Young Blood

dsc_0005

A tragic tale of a star-crossed pair,
science vs. a journalist’s flair

When reporting on scientific topics, particularly when describing individual papers, how important is it for the popular coverage to have anything to do with the source material? Let’s take a look at a recent paper from Justin Rebo and others in Nature Communications and the accompanying coverage by Claire Maldarelli at Popular Science.

Interest in parabiosis has increased recently due to coverage of scientific papers describing promising results in mice and the high profile of some parabiosis enthusiasts. Parabiosis, from the Greek for “living beside”, has typically involved stitching two mice together. After a few days, the fused tissue provides blood exchange through a network of newly formed capillaries.

The most recent investigation into the healing effects of youthful blood exchange, from Rebo et al., expands the equipment list beyond the old technique of surgically joining two animals. Instead of relying on the animals to grow new capillary beds for blood exchange to occur, the authors of the new paper used a small pump to exchange a few drops of blood at a time until each mouse carried roughly equal proportions of its own blood and its partner’s.

According to the coverage from Popular Science:

While infusing blood from a younger mouse into an older mouse had no effect on the elderly mouse in the latest study, infusing blood from an older mouse into a younger one caused a host of problems in organs and other tissues.

Just a few paragraphs later, Maldarelli quotes Conboy (last author on the paper) as saying “This study tells us that young blood, by itself, cannot work as medicine.” In contrast, the authors state in the paper that “Importantly, our work on rodent blood exchange establishes that blood age has virtually immediate effects on regeneration of all three germ layer derivatives,” and later that “. . . extracorporeal blood manipulation provides a modality of rapid translation to human clinical intervention.”[1] There seems to be a bit of disagreement between the version of Conboy on the author list of the scientific article and the version of Conboy quoted in the PopSci coverage of the same article.

We also learned from Maldarelli that the tests reported in the paper were performed a month after the blood exchange procedure; according to the paper itself, however, the longest interval between blood exchange and the end of the experiment (sacrifice for post-mortem tissue analysis) was six days.

I came across the PopSci coverage when it appeared on a meta-news site that highlights popular web articles, so it’s safe to assume I wasn’t the first to read it. Shouldn’t the coverage of scientific articles in the lay press have more in common with the source material than just buzzwords? The science wasn’t strictly cut and dried: not every marker or metric responded in the same way to the old/young blood exchange, and while I agree that we shouldn’t encourage anyone to build a blood-exchange rejuvenation pod in their garage, the findings of the article fell a long way from the conclusion reported in the lay article: that young blood had no effect on the physiology of old mice. This is to say nothing of the quality of the paper itself and the confidence we should assign to the experimental results in the first place: with 12 mice total* and a p-value cutoff of 0.05 (a threshold at which roughly 1 in 20 null comparisons will appear significant by chance), I’d take the original results with a grain of salt as well.
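
To put a rough number on that skepticism, here is a minimal simulation sketch. It is not the authors’ analysis: the group size matches the stated purchase of six old and six young mice, but the effect size and noise level are invented for illustration. It shows both how often a two-sample t-test on groups this small flags a nonexistent effect and how often it misses a genuinely large one.

```python
# Minimal sketch: how trustworthy is p < 0.05 with six mice per group?
# Group size follows the stated purchase (6 old + 6 young); the effect size
# and noise level are invented for illustration, not taken from Rebo et al.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_group, n_trials, alpha = 6, 10_000, 0.05

def fraction_significant(true_effect_sd):
    """Fraction of simulated experiments where a two-sample t-test gives p < alpha."""
    hits = 0
    for _ in range(n_trials):
        old = rng.normal(0.0, 1.0, n_per_group)
        young = rng.normal(true_effect_sd, 1.0, n_per_group)
        if stats.ttest_ind(old, young).pvalue < alpha:
            hits += 1
    return hits / n_trials

print("Apparent hit rate with no true effect:", fraction_significant(0.0))  # ~0.05 by construction
print("Power to detect a 1-SD true effect:   ", fraction_significant(1.0))  # well under the usual 0.8
```

With six animals per group, even a full standard deviation of true difference is picked up less than half the time, so both the positive and the negative findings at this scale deserve caution.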

This is the face of science we show the public, and it’s unreliable. It is no easy task for journalists to accurately report and interpret scientific research. Deadlines are tight, and writers face competition and pressure from cheap amateur blogs and regurgitation feeds. “What can I do to help?” you ask. As a consumer of information you can demand scientific literacy in the science news you consume. Ask writers to convey confidence and probability in a consistent way that non-specialists can understand and compare with other results. As a bare minimum, science and the press that covers it should at least have more in common than the latest brand of esoteric jargon.

If we only pay attention to the most outlandish scientific results, then most scientific results will be outlandish.

*The methods describe a purchase of 6 old and 6 young mice. However, elsewhere in the paper the groups are said to contain 8 mice each. Thus it is not clear how many mice were used in these experiments in total, or how 12 blood exchange pairings could have been created for both control and experimental groups without re-using the same mice.

[1] Rebo, J. et al. A single heterochronic blood exchange reveals rapid inhibition of multiple tissues by old blood. Nat. Commun. 7, 13363 doi: 10.1038/ncomms13363 (2016).

A month on Mars

The year is 2035, and the new space race is well underway.

Jeffrey Aussat straightened his back under the Martian sun. He stretched as he leaned onto the handle of his space-shovel, raising his hand to wipe the sweat from his tired brow. Of course this made him feel stupid, as it had every time since they landed. His clumsy hand, gloved up and looking for all the world just like the hand of Gozer the Destructor, stopped short as it met the clear glass of his visor. Jeff cursed himself over the unavoidable fact that, despite nearly a (Mars) month since they arrived on the spaceship Clever Reference, he still couldn’t get used to the simplest things. Like the need to have this damn fish-bowl on every time he went outside.

Jeff cursed himself again as his shovel snapped in half. Losing focus during retrospection and self-pity, he somehow must have applied an off-axis load onto the carbon fiber handle. A few moments’ respite for his weary, microgravity-weakened bones had turned into disaster. On Mars the gravity may be slight, but the days sure are long; they don’t tell you that in the brochure.

Jeff now found himself up a recurring slope lineae without a planetary-protection-cleared drill bit. Jeff and his partner had started out their ‘stead with 32 shovels, and in just a few weeks every single one had fallen prey to some combination of user error and catastrophic failure. Every building in their inflatable homestead creation kit was designed to be placed underground, damping temperature swings and blocking some of the deadly radiation pouring down on the Martian surface. Specifically, the buildings needed a huge amount of ground piled on top of them to keep the humans alive, and without a working shovel they couldn’t move regolith quickly enough to make their new home habitable. Due to some shady logistics, they wouldn’t receive their “mule” (a heavy-lifting robot) until the next colonization flotilla arrived, roughly two years on.

Jeff held the transmit button on his radio as he slumped down in the shade of his space-wheelbarrow, half-piled high with regolith and also made from carbon fiber. “Becky, I think we have a problem,” he said.

After a short intermission of static, Becky replied with a sigh, “You’ve got a leak in your suit again, don’t you?” Getting used to the strange Martian gravity after playing zero-G ping pong for three months, Jeff had often ended up tumbling down to hands and knees during the first weeks of their stay, a stress the suits were well-designed to withstand. Repeated joint flexion of the suit fabric with embedded Martian dust, however, rapidly opened up a community of near-microscopic pinholes that were almost impossible to find and patch.

“No, not this time. It’s the shovel.”

“The last shovel?”

Jeff paused. “… Yeah.” This was bad. They would have to resort to much less efficient regolith maneuvering techniques, working only at night and sleeping under the raw materials in the shed to limit radiation exposure. After the recurring problem with clumsiness-induced suit leaks, Becky’s patience was sure to be running out on him. The trip over had already placed enough stress on their relationship. “Is the 3D printer working yet? Maybe we can print a new one, or print a repair splint for one of the frayed shovel shafts.”

Silence followed for nearly a minute. She was either checking the printer status or seriously considering filing flight plans to leave. “I’m afraid the printer’s still down. The print nozzle was damaged during the last maintenance test.”

“Oh,” Jeff replied. He didn’t finish converting the thought running through his head to speech: so we’re screwed, then.

“No problem. I’ll order a fresh crate from Amazon.”

“What?” This was either a joke, a hoax, or lifesaving news.

“Check your email. They’ve opened up a new distribution center on Phobos. Bezos built it up and staffed it without telling anybody.”

“You’ve got to be kidding me.”

“No joke. I need a few extra items to qualify for free shipping, do you need anything?”

“I’m sure we can think of something. I’ll return to the compound with the regolith I’ve collected and we can run an inventory.” Jeff tossed the broken shovel on top of the regolith in the enormous wheelbarrow. The designers had figured that, since everything on Mars weighed so much less than on Earth, all the tools should be that much larger. The result was a suite of construction and farming tools that were cartoonishly two and a half times too large when fully assembled. As Jeff wheeled the barrow around to face the glint from the compound’s solar panels, he felt his mood pick up. They were going to be OK after all.

“There’s something else going on that’s a bit weird,” Becky said.

Jeff skipped a step, catching himself on the wheelbarrow handles to prevent impregnating the knees on his suit with more abrasive dust. “What is it?” he asked.

“You remember that huge rover from 2020?”

Jeff made a vague confirmatory noise. “Uh . . . the Scrutiny, was it?”

“Yeah, that’s the one. It’s attacking the water scavenging plant.”

“What? Why? I thought that thing was supposed to be retired by now, parked somewhere near Jezero delta?”

“Well it’s here, and it’s pushing the water plant over. The LEDs are putting out some sort of Morse code; I’m still trying to figure it out,” Becky explained.

“How long until it damages the water plant?” Jeff inquired.

“At this rate, probably a couple of weeks. They didn’t move very fast back then.”

Jeff felt the spring return to his step. Two weeks was enough time to contact the mission controllers for help debugging the rover’s strange behavior. As he realized the problem was tractable, he felt the physical sensation of a weight lifting from his shoulders. Also, the motility assist systems on his suit had finally finished calibrating.

“Too bad they didn’t set up the distro center in time for Mars One,” Jeff joked.

“Too soon, Jeff, that’s not funny,” Becky said coldly.

The Mars One mission had ended in a tragicomic maelstrom of cannibalism and incidental lyophilization. The cameras, intended to live-broadcast the travails of the crew around the clock, were among the last systems still running on the capsule. Although the sponsors had long disavowed any relationship to the mission, anyone with a standard transceiver and a darkly morbid curiosity could ping the ship and tune in to the dismal situation. A series of planned challenges and mission-planning fiascos ultimately meant the crew never got onto the correct Mars rendezvous trajectory. In their current orbit, apoapsis would never quite reach Mars, nor would periapsis ever bring them close enough for an earthly recapture. Ironically, what remained of the crew and craft would probably outlast us all. The perfectly preserved astronauts would remain unchanged for millennia in their wayward but stable orbit, like confused Pharaohs circling the portal to the netherworld.

A skeptic over coffee: sick of lab meetings

rhinovirus

This post brought to you by a dedicated community of human rhinovirus (PDB model 1AYM).

Imagine the following dialogue between researchers:

Wayne the Brain: “Third one this week ::Cough:: I am literally sick of lab meetings.”
Wankdorf: “Oh I feel ya. There are way too many lab meetings. It’s a real waste of time, but that’s the cost of pulling from so many different realms of expertise in interdisciplinary projects.”
Wayne the Brain: “No no no, I am literally sick of lab meetings. All the exposure is really taking a toll on my health. ”
Wankdorf: “Why didn’t you say so?! Stay away, you purveyor of vile pestilence! ::cough::”

I hope, dear reader, that you spotted the root cause of their misunderstanding. Wayne (the Brain) was pointing to lab meetings as the suspected source of his infection while simultaneously advertising his own condition as definitely infected and possibly contagious. Wankdorf (unsurprisingly) misinterpreted the statement by applying the more colloquial definition of the term “literally.” It’s not clear whether infection of the second researcher could have been avoided and the spread of the disease slowed had they practised more effective communication, but that scenario is plausible given what we know.

Of course this is an extreme example, and the consequences may not always be so dire. The most frustrating part of the above exchange and subsequent misunderstanding is that neither participant was strictly wrong in the definition they assumed for “literally.” This word now literally can be used to say “in the truest sense of the words” and the exact opposite, and my brain literally imploded when I learned about the new definition.

If you don’t believe me, check out the definition in both the Cambridge and Merriam-Webster online dictionaries. I’ve screenshotted the definitions to preserve this embarrassment for posterity:

merriamwebsterliterally

cambridgeliterally

Language is dynamic; some (Wankdorf et al.) would even say that it is dynamical. Hence it doesn’t make you appear smarter to bore your friends by talking about Romans every time they say “decimate.” Language is constantly changing in response to the selective pressures of popular usage, subject to many factors as people and cultures interact.

Similar to many other examples of evolution, humans affect the way a language changes by taking note of and modifying the selective pressures they individually exert. The consequences may be particularly important in science, where English is the common tongue but not in general the first language of most practitioners. I expect that modern English will evolve to encompass multiple forms based on usage. Native speakers sat on the British Isles, laying in North America, and so on will continue to retain and invent complexity and idiosyncrasy, while international English will come to resemble a utilitarian version of Up-Goer Five English, paring off superfluous complexities while retaining the most effective elements to become as simple as possible, but no simpler. It’s possible that international English will even retain sarcasm.

Pop quiz: what are your favourite English-speaker idiosyncrasies used in this article?

A skeptic over coffee: who owns you (your data)?

AskDNA

“Everyone Belongs to Everyone Else”

-mnemonic marketing from Aldous Huxley’s Brave New World

A collaboration between mail-order genomics company 23andMe and pharmaceutical giant Pfizer reported 15 novel genetic loci linked to depression in a genome-wide association study published in Nature Genetics. The substantial 23andMe user base and the relatively high prevalence of the condition provided the numbers necessary to find correlations between a collection of single nucleotide polymorphisms (SNPs) and depression.

This is a gentle reminder that even when the service isn’t free, you very well may be the product. It’s not just Google and Facebook whose business plans hinge on user data. From 23andMe’s massive database of user genetic information to Tesla’s fleet learning Autopilot (and many more subtle examples that don’t make headlines), you’re bound to be the input to a machine learning algorithm somewhere.

On the one hand, it’s nice to feel secure in a little privacy now and again. On the other, who would turn down a blissful technological utopia? If only the tradeoffs were so clear. Note that some (including bearded mo. bio. maestro George Church) say that privacy is a thing of the past and that openness is the key (the 23andMe study participants did consent to their data being used for research). We’ve known for a while that it’s possible to infer the sources of anonymous genome data from publicly available metadata.

The data of every person are fueling the biggest changes of our time in transportation, technology, healthcare, and commerce, and there’s a buck (or a trillion) to be made there. It remains to be seen whether the benefits will mainly be consolidated by those who already control large pieces of the pie or will fall largely to the multitudes making up the crust (with plenty of opportunities for crumb-snatchers). On the bright side, if your data make up a large enough portion of the machine learning inputs for the programs that eventually coalesce into an omnipotent AI, maybe there’ll be a bit of you in the next-generation superorganism.

Through the strange eyes of a cuttlefish

A classic teaching example in black-and-white film photography courses is the tomato on a bed of leaves. Without the use of a color filter, the resulting image is low-contrast and visually uninteresting. The tomato is likely to look unnaturally dark and lifeless next to similarly dark leaves; although in a color photograph the colors make for a stark contrast, the intensity values of the red tomato fruit and the green leaves are in fact nearly the same. A red or green filter attenuates the intensity of one of the colors, making it possible for an eager photographer to undertake the glamorous pursuit of fine-art salad photography.
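
As a toy illustration of what the filter is doing, here is a minimal sketch; the RGB values for “tomato” and “leaf” and the filter transmissions are invented for illustration, and panchromatic film is crudely approximated by averaging the channels.

```python
# Toy sketch of the tomato-on-leaves problem: similar grayscale values without a filter,
# clearly separated values once a red filter attenuates the green and blue channels.
# The RGB values and filter transmissions are illustrative assumptions.
import numpy as np

tomato = np.array([190.0, 60.0, 45.0])   # hypothetical tomato red
leaf   = np.array([70.0, 150.0, 60.0])   # hypothetical leaf green

def to_gray(rgb, filter_transmission=(1.0, 1.0, 1.0)):
    """Crude panchromatic conversion: average the channels after an optional color filter."""
    filtered = rgb * np.array(filter_transmission)
    return filtered.mean()

red_filter = (1.0, 0.2, 0.2)  # passes red, attenuates green and blue

print("No filter:  tomato %.0f vs leaf %.0f" % (to_gray(tomato), to_gray(leaf)))
print("Red filter: tomato %.0f vs leaf %.0f" % (to_gray(tomato, red_filter),
                                                to_gray(leaf, red_filter)))
```

Without the filter the two grayscale values land within a few percent of each other; with the red filter the tomato stays bright while the leaves go dark, which is the whole trick.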

Caprese_cherry_tomatoesBWColourComparison

The always clever cephalopods (smart enough to earn honorary vertebrate status in UK scientific research) somehow manage to pull off a similar trick without the use of a photographer’s color filters. Marine biologists have been flummoxed for years by the ability of squid, cuttlefish, and octopuses* to effect exact color camouflage in complex environments, and by their impressive use of color patterning in hunting and inter-species communication. The paradox is that their eyes (the cephalopods’, not the marine biologists’) contain only a single type of photoreceptor, rather than the two or more color photoreceptors of humans and other color-sensitive animals.

Berkeley/Harvard duo Stubbs & Son have put forth a plausible explanation for the age-old paradox of color camouflage in color-blind cephalopods. They posit that cephalopods use chromatic aberration and a unique pupil shape to distinguish colors. With a wide, w-shaped pupil, cephalopods potentially retain much of the color blurring of different wavelengths of light. Chromatic aberration is nothing more than color-dependent defocus, and by focusing through the different colors it is theoretically possible for the many-limbed head-foots to use their aberrated eyes as an effective spectrophotometer, using a different eye length to sharply focus each color. A cuttlefish may distinguish tomato and lettuce in a very different way than a black and white film camera or human eyes.

tomatoRGBcuttleVision

A cuttlefish’s take on salad

A cuttlefish might focus each wavelength sequentially to discern color. In the example above, each image represents preferential focus for red, green, and blue, from top to bottom. By comparing each image to every other image, the cephalopod could learn to distinguish the colorful expressions of its friends, foes, and environment. Much as our own visual system automatically filters and categorizes objects in a field of view before we know it, much of this perception likely occurs at the level of “pre-processing,” before the animal is acutely aware of what it is seeing.

cuttleVisionKalamar

How a cuttlefish might see itself

seaCottonComp

A view of the reef.

A typical night out through the eyes of a cuttlefish might look something like this:

There are distinct advantages to this type of vision in specialized contexts. With only one type of photoreceptor, light sensitivity is increased compared to the same eye with multiple types of photoreceptors (ever notice how human color acuity falls off at night?). Mixed colors would look distinctly different, and individual pure wavelengths could potentially be distinguished more accurately. In human vision we can’t tell the difference between an individual wavelength and a mix of colors that happens to excite our color photoreceptors in the same proportions as the pure color, but a cuttlefish might be able to resolve these differences.

On the other hand, the odd w-shaped pupil of cephalopods retains more imaging aberrations than a circular pupil (check out the dependence of aberrations on pupil radius in the corresponding Zernike polynomials to understand why). As a result, cephalopods would have slightly worse vision in some conditions than humans with the same eye size; mainly, those conditions consist of living on land. Human eyes underwater are not well suited to the higher refractive index of water as compared to air. We would also probably need to incorporate some sort of lens hood (e.g. something like a brimmed hat) to deal with the strong gradient of light formed by light absorption in the water, another function of the w-shaped cephalopod pupil.

Studying the sensory lives of other organisms provides insight into how they might think, illuminating our own vision and nature of thought by contrast. We may still be a long ways off from understanding how it feels to instantly change the color and texture of one’s skin, but humans have just opened a small aperture into the minds of cuttlefish to increase our understanding of the nature of thought and experience.

How I did it
Every image is formed by smearing light from a scene according to the point spread function (PSF) of the imaging system. This is a consequence of the wave nature of light and the origin of the diffraction limit. In Fourier optics, the PSF is the squared magnitude of the Fourier transform of the pupil function. To generate the PSF, I thresholded and dilated this image of a common cuttlefish eye (public domain from Wikipedia user FireFly5), then took the Fourier transform and squared the result. To generate the images and video mentioned above, I added differential defocus (using the Zernike polynomial for defocus) to each color channel and cycled through the resulting three monochromatic images. I used ImageJ and Octave for image processing.
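
For anyone who wants to play along without ImageJ or Octave, here is a rough Python sketch of the same pipeline. The file names are placeholders, the dilation step is skipped, the pupil mask is assumed to be the same size as the photo, and the per-channel defocus coefficients are arbitrary.

```python
# Rough Python sketch of the pipeline described above (the original used ImageJ + Octave).
# File names are placeholders, the pupil mask is assumed to be the same size as the photo,
# and the per-channel defocus coefficients are arbitrary.
import numpy as np
from numpy.fft import fft2, ifft2, fftshift, ifftshift
import imageio.v2 as imageio

def defocus_phase(shape, waves):
    """Zernike defocus term sqrt(3)*(2*rho^2 - 1), scaled to `waves` waves of defocus."""
    ny, nx = shape
    y, x = np.mgrid[-1:1:1j * ny, -1:1:1j * nx]
    rho2 = x**2 + y**2
    return 2 * np.pi * waves * np.sqrt(3) * (2 * rho2 - 1)

def psf_from_pupil(pupil, waves):
    """PSF = |FFT(pupil * exp(i * defocus phase))|^2, normalized to unit sum."""
    field = pupil * np.exp(1j * defocus_phase(pupil.shape, waves))
    psf = np.abs(fftshift(fft2(field)))**2
    return psf / psf.sum()

def blur(channel, psf):
    """Convolve one color channel with the PSF via the FFT."""
    return np.real(ifft2(fft2(channel) * fft2(ifftshift(psf))))

# Thresholded pupil image (e.g. the Wikipedia cuttlefish eye) used as a binary mask.
pupil_img = imageio.imread("cuttlefish_pupil.png").astype(float)   # hypothetical file
if pupil_img.ndim == 3:
    pupil_img = pupil_img.mean(axis=-1)
pupil = (pupil_img > 128).astype(float)

photo = imageio.imread("salad.png").astype(float) / 255.0          # hypothetical file

# One defocus value per channel stands in for chromatic aberration:
# red, green, and blue are brought to focus at different eye lengths.
for channel, waves in enumerate([0.0, 1.0, 2.0]):
    psf = psf_from_pupil(pupil, waves)
    photo[..., channel] = blur(photo[..., channel], psf)

imageio.imwrite("salad_cuttlefish_view.png",
                np.clip(photo * 255, 0, 255).astype(np.uint8))
```

Swapping which channel receives the smallest defocus and saving each variant separately would approximate the cycled, preferentially focused frames described above.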

Sources for original images in order of appearance:

https://en.wikipedia.org/wiki/File:Cuttlefish_eye.jpg

https://commons.wikimedia.org/wiki/File:Caprese_cherry_tomatoes.JPG

https://en.wikipedia.org/wiki/File:Kalamar.jpg


https://en.wikipedia.org/wiki/Coral_reef#/media/File:Sea_Cotton.jpg

And Movie S2

*The plural of octopus has all the makings of another senseless ghif/gif/zhaif controversy. I have even heard one enthusiast insist on “octopodes.”

Bonus Content:

RGBTest

Primary color disks.

In particular, defocus pseudocolor vision would make for interesting perceptions of mixed wavelengths. Observe the color disks above (especially the edges) in trichromatic and defocus pseudo-color.

camoCuttle03

cuttleW

The aperture used to calculate chromatic defocus.

Bonus content original image sources:

Swimming cuttlefish in camouflage, CC BY-SA, Wikipedia user Konyali43, available at: https://commons.wikimedia.org/wiki/File:Camouflage_cuttlefish_03.jpg

The aperture I used for computing chromatic defocus is a mask made from the same image as the top image for this post: https://en.wikipedia.org/wiki/File:Cuttlefish_eye.jpg

Perspective across scales (Spores, molds, and fungus* – recap)

*Actually just lichens and a moldy avocado

Take your right hand and cover your left eye. Keeping both eyes wide open, look at an object halfway across the room. You can now “see through your hand.”** Your brain compiles the world around you into a single image that we intuitively equate with media such as photography and video, but in fact (as evidenced by your brain ignoring the hand occluding half your visual inputs) this mental image of the world is compiled from two different perspectives. The processing side of the human visual system is therefore very well set up to interpret stereographic images. Some people complain about this, but you can always file a bug report with reality if it becomes too much trouble.

Human binocular vision works pretty well at scales where the inter-ocular distance provides a noticeable difference in perspective, but not for objects that are very close or very far away. This is why distant mountains look flat [citation needed], and why we don’t have good spatial intuition for very small objects, either. Stereophotography can improve our intuition for objects outside the scales of our usual experience: by modifying the distance between two viewpoints, we can enhance our experience of perspective.

For these stereo photos of lichens, I used a macro bellows with a perspective control lens. This type of lens is normally used for fixing vanishing lines in architectural photography, or for making things that aren’t tiny look tiny, but in this case it makes a useful tool for shifting perspective by a few centimetres.


stereoMacroLens1

It would probably be easier to move the sample instead.

stereoMacroSample

The images below require a pair of red-blue filters or 3D glasses to shepherd a different perspective into each eye for spatial interpretation in your meat-based visual processor; a minimal compositing sketch follows the images.

niceLichenAnaglyph

lichenAgainAnaglyph

anotherLichenAnaglyph

avocadoMold

curledLichenTM2016June
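
Each anaglyph above is just a left/right stereo pair merged channel-wise: the red channel comes from one viewpoint and green and blue from the other. Here is a minimal compositing sketch; the file names are placeholders, and the two frames are assumed to be the same size and roughly aligned.

```python
# Minimal red-cyan anaglyph: red channel from the left-eye photo, green and blue from the right.
# File names are placeholders; both images are assumed to be the same size and roughly aligned.
import imageio.v2 as imageio

left = imageio.imread("lichen_left.jpg")    # hypothetical left-eye frame
right = imageio.imread("lichen_right.jpg")  # hypothetical right-eye frame

anaglyph = right.copy()
anaglyph[..., 0] = left[..., 0]             # red from the left eye, cyan (G+B) from the right

imageio.imwrite("lichen_anaglyph.jpg", anaglyph)
```

Swapping which viewpoint feeds the red channel simply inverts the perceived depth.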

Another way to generate the illusion of dimensionality is parallax. This is a good way to judge depth when your eyes are on opposite sides of your head.

DSC_0042

DSC_0072

DSC_0051

curledLichenTM2016JuneGIF
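
The wiggle GIF above is just the alternating viewpoints played on a loop. A minimal stitching sketch follows, with placeholder frame names and an arbitrary frame duration.

```python
# Minimal "wiggle" stereo sketch: alternate the two viewpoints in a looping GIF.
# Frame file names and the frame duration are placeholders/assumptions.
import imageio.v2 as imageio

frames = [imageio.imread("lichen_view_a.jpg"),   # hypothetical first viewpoint
          imageio.imread("lichen_view_b.jpg")]   # hypothetical second viewpoint

# duration is seconds per frame for imageio's classic GIF writer; loop=0 repeats forever
imageio.mimsave("lichen_wiggle.gif", frames, duration=0.25, loop=0)
```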

**If you currently have use of only one eye, the same effect can be achieved by holding the eye of a needle, or another object thinner than your pupil, directly in front of the active eye. This is something that Leonardo (the blue one) remarked on, and it suggests the similarity between imaging with a relatively large aperture (like your dilated pupil) and an “image” reconciled from multiple perspectives, as in binocular vision.

Super Gravity Brothers

GW150914MorletSpec

The GW150914 black hole merger event recorded by aLIGO, represented in a wavelet (Morlet basis) spectrogram. This spectrogram was based on the audio file released with the original announcement.

The data from the second detection, GW151226, are another beast entirely: the signal is very much buried in the noise.

Raw data:

gw151226

Wavelet Spectrogram: gw151226CWTspec

The LIGO Open Science Center makes these data available, along with signal processing tutorials.

Now to see how the professionals do it:

I used MATLAB’s wavelet toolbox for the visualisations, aided by this example.
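
For anyone without a MATLAB license, here is a rough sketch of the same idea using PyWavelets; the audio file name, frequency band, and wavelet parameters are assumptions for illustration, not the settings behind the figures above.

```python
# A rough Python sketch of a Morlet wavelet spectrogram of the released audio snippet.
# The file name, frequency band, and wavelet parameters are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt
import pywt
from scipy.io import wavfile

rate, strain = wavfile.read("GW150914_audio.wav")     # hypothetical local copy of the audio
strain = strain.astype(float)
if strain.ndim > 1:                                   # collapse stereo to mono if needed
    strain = strain.mean(axis=1)

freqs = np.linspace(20.0, 500.0, 200)                 # Hz, roughly the band of the chirp
wavelet = "cmor1.5-1.0"                               # complex Morlet (bandwidth-center naming)
scales = pywt.central_frequency(wavelet) * rate / freqs

coeffs, cwt_freqs = pywt.cwt(strain, scales, wavelet, sampling_period=1.0 / rate)

times = np.arange(strain.size) / rate
plt.pcolormesh(times, cwt_freqs, np.abs(coeffs), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Morlet wavelet spectrogram (sketch)")
plt.show()
```

For GW151226 the chirp should still be hard to pick out by eye, which is rather the point: that detection leaned on matched filtering rather than a glance at the spectrogram.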