## A Skeptic Over Coffee: Young Blood Part Duh

Does this cloudy liquid hold the secret to vitality in your first 100 years and beyond? I can’t say for sure that it doesn’t. What I can say is that I would happily sell it to you for \$8,000.

Next time someone tries to charge you a premium to intravenously imbibe someone else’s blood plasma, you have my permission to tell them no thanks. Unless there’s a chance that it’s fake; then it might be worth doing.

Californian company Ambrosia LLC has been making the rounds in publications like the New Scientist hype-machine to promote claims that their plasma transfusions show efficacy at treating symptomatic biomarkers of aging. Set up primarily to exploit rich people by exploiting younger, poorer people on the off chance that the Precious Bodily Fluids of the latter will invigorate the former, the small biotech firm performed a tiny study of over-35s receiving blood plasma transfusions from younger people. It’s listed on clinicaltrials.gov and everything.

First of all, to determine the efficacy of a treatment it’s important that both the doctors and the patients are blinded to whether they are administering/being administered the active therapeutic. That goes all the way up the line from the responsible physician to the phlebotomist to the statistician analyzing the data. But to blind patients and researchers the study must include a control group receiving a placebo treatment, which in this case there was not. So it’s got that going for it.

To be fair, this isn’t actually bad science. For that to be true, it would have to be actual science. Not only does a study like this require a control to account for any placebo effect*, but the changes reported for the various biomarkers may be well within common fluctuations.

Finally, remember that if you assess 20 biomarkers at the common significance cutoff of p=0.05, on average one of the twenty will clear that bar even when nothing real is going on. That’s what the cutoff means: a 1 in 20 chance of seeing a difference that large purely by random chance when there is no true effect. Quartz reports the Ambrosia study looked at about 100 different biomarkers and mentions positive changes in 3 of them. I don’t know whether they performed statistical tests at a cutoff of 0.05, but if so you should expect, on average, 5 of 100 biomarkers in a screen to show a statistical difference by chance alone. This isn’t the first case of questionable statistics selling fountain-of-youth concepts.
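You don’t have to take my word for the arithmetic. Here’s a minimal sketch in Python that simulates a study where *no* biomarker truly changes: under the null hypothesis each test’s p-value is uniform on [0, 1], so “significant” results pop up at the cutoff rate anyway. (The study sizes are my assumptions, chosen to match the numbers above, not anything published by Ambrosia.)

```python
import random

random.seed(42)

ALPHA = 0.05          # conventional significance cutoff
N_BIOMARKERS = 100    # roughly what Quartz reports the study screened
N_STUDIES = 10_000    # simulated replications of a completely null study

# Under the null hypothesis, each test's p-value is uniform on [0, 1],
# so each biomarker "passes" with probability ALPHA by chance alone.
false_positives = [
    sum(random.random() < ALPHA for _ in range(N_BIOMARKERS))
    for _ in range(N_STUDIES)
]

mean_hits = sum(false_positives) / N_STUDIES
at_least_3 = sum(fp >= 3 for fp in false_positives) / N_STUDIES

print(f"average 'significant' biomarkers per null study: {mean_hits:.2f}")
print(f"fraction of null studies with >= 3 'hits': {at_least_3:.2%}")
```

Run it and the average lands right around 5 hits per study, and the overwhelming majority of these do-nothing studies produce at least 3 “significant” biomarkers. Which is to say: 3 positive changes out of 100 uncorrected tests is exactly what a treatment that does nothing would deliver.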

All of this is not to say that the experiments disprove the positive effects of shooting up teenage PBFs. It also generated zero conclusive evidence against the presence of a large population of English teapots in erratic orbits around Saturn.

You could conclude by saying “more scientific investigation is warranted” but that would imply the work so far was science.

* The placebo effect can even apply to as seemingly objective a treatment as surgery. Take this 2013 study that found no statistical difference in the outcomes of patients with knee problems treated with either arthroscopic surgery or a surgeon pretending to perform the surgery.


## A skeptic over coffee: sick of lab meetings

This post brought to you by a dedicated community of human rhinovirus (PDB model 1AYM).

Imagine the following dialogue between researchers:

Wayne the Brain: “Third one this week ::Cough:: I am literally sick of lab meetings.”
Wankdorf: “Oh I feel ya. There are way too many lab meetings. It’s a real waste of time, but that’s the cost of pulling from so many different realms of expertise in interdisciplinary projects.”
Wayne the Brain: “No no no, I am literally sick of lab meetings. All the exposure is really taking a toll on my health. ”
Wankdorf: “Why didn’t you say so?! Stay away, you purveyor of vile pestilence! ::cough::”

I hope, dear reader, that you spotted the root cause of their misunderstanding. Wayne (the Brain) was using “literally” in its traditional sense: blaming lab meetings for transmitting his illness while advertising his own condition as definitely infected and possibly contagious. Wankdorf (unsurprisingly) misinterpreted the statement by applying the more colloquial definition of the term “literally.” It’s not clear whether infection of the second researcher could have been avoided and the spread of the disease slowed had they practised more effective communication, but that scenario is plausible given what we know.

Of course this is an extreme example, and the consequences may not always be so dire. The most frustrating part of the above exchange and subsequent misunderstanding is that neither participant was strictly wrong in the definition they assumed for “literally.” This word now literally can be used to say “in the truest sense of the words” and the exact opposite, and my brain literally imploded when I learned about the new definition.

If you don’t believe me, check out the definition in both the Cambridge and Merriam-Webster online dictionaries. I’ve screenshotted the definitions to preserve this embarrassment for posterity:

Language is dynamic; some (Wankdorf et al.) would even say that it is dynamical. Hence it doesn’t make you appear smarter to bore your friends by talking about Romans every time they say “decimate.” Language is constantly changing in response to the selective pressures of popular usage, subject to many factors as people and cultures interact.

As in many other examples of evolution, humans affect the way a language changes by taking note of, and modifying, the selective pressures they individually exert. The consequences may be particularly important in science, where English is the common tongue but not, in general, the first language of most practitioners. I expect that modern English will evolve to encompass multiple forms based on usage. Native speakers sat on the British Isles, laying in North America, and so on will continue to retain and invent complexity and idiosyncrasy. International English, meanwhile, will come to resemble a utilitarian version of Up-Goer Five English, paring off superfluous complexities while retaining the most effective elements to become as simple as possible, but no simpler. It’s possible that international English will even retain sarcasm.

## A skeptic over coffee: who owns you(r) data?

“Everyone Belongs to Everyone Else”

– mnemonic marketing from Aldous Huxley’s Brave New World

A collaboration between mail-order genomics company 23andMe and pharmaceutical giant Pfizer reported 15 novel genes linked to depression in a genome-wide association study published in Nature. The substantial 23andMe user base and relative prevalence of the mental illness provided the numbers necessary to find correlations between a collection of single nucleotide polymorphisms (SNPs) and the condition.

This is a gentle reminder that even when the service isn’t free, you very well may be the product. It’s not just Google and Facebook whose business plans hinge on user data. From 23andMe’s massive database of user genetic information to Tesla’s fleet learning Autopilot (and many more subtle examples that don’t make headlines), you’re bound to be the input to a machine learning algorithm somewhere.

On the one hand, it’s nice to feel secure in a little privacy now and again. On the other, blissful technological utopia? If only the tradeoffs were so clear. Note that some (including bearded mo. bio. maestro George Church) say that privacy is a thing of the past, and that openness is the key (the 23andMe study participants consented that their data be used for research). We’ve known for a while that it’s possible to infer the sources of anonymous genome data from publicly available metadata.

The data of every person are fueling the biggest changes of our time in transportation, technology, healthcare and commerce, and there’s a buck (or a trillion) to be made there. It remains to be seen whether the benefits will mainly be consolidated by those who already control large pieces of the pie or will fall largely to the multitudes making up the crust (with plenty of opportunities for crumb-snatchers). On the bright side, if your data make up a large enough portion of the machine learning inputs for the programs that eventually coalesce into an omnipotent AI, maybe there’ll be a bit of you in the next-generation superorganism.

## aSOC: Speculative Fiction

SF vs. SF: what’s the best way out of the werewolves and wizards section and onto the serious shelves?

Friendship and mentorship don’t need to be limited to those one has the chance to meet in a given lifetime. Thoughts recorded in the written word and other forms make it possible to take on teachers, fellows, and adversaries across vast swathes of space and time. A friend in a book doesn’t even have to listen, only speak. Such friends can provide a welcome refuge when one is alone despite the crowd, adrift in a sea of humanity with an unrelenting feeling of solitude: a special case of being surrounded by an intellectual version of water, water everywhere, but not a drop of it to drink.

The back sections of bookstores have long been a strong attractor for the lonely imaginative ones, the awkward and detached (not quite that far back, you don’t need to know a password or ask the clerk to be let in). For those bookstores still standing, this section will be tucked away in a corner somewhere, maybe hidden on the top floor next to the owner’s apartment or sandwiched between the WC and the fire escape. This is the science fiction section.

Typically sci-fi is, ironically in some ways, lumped in with fantasy. One genre describes what might be possible while the other describes what is definitely impossible. It’s true that much of the so-called SF genre (particularly the “indistinguishable from magic” variety) carries little to differentiate itself from your run-of-the-mill swords and sorcery. Despite this, the hallmark of the genre is an element of science that, if removed, would diminish the story. That doesn’t mean the plot won’t be character-driven or relatable, but it does give the writer the chance to experiment with people in an enhanced diversity of contexts. Within a single genre, science fiction is tops for the sheer breadth of different stories, societal structures, and characters that are possible.

The stereotypical sci-fi enthusiast I describe above is perhaps a bit lonely and awkward, distracted from the normal world as it is, and even a bit antisocial. But the negative connotation of the sci-fi nerd as a misanthropic outcast is a convenient stereotype and an oversimplification. The mindset that causes one to seek a realm apart is not so much a type of person as an aspect of human experience that we all dip into from time to time, often with creative and fulfilling results.

So unlike the societal myth of the unwashed legions of basement-dwelling fandom, we probably all contain some antisocial nerd deep down inside. It’s a part of life with some valuable rewards in terms of introspection and preparing for an uncertain future. So why do some authors, typically literary types with degrees $\geq$ Masters, try so hard to distance themselves from the genre by insisting their work is “speculative fiction,” as if that were mutually exclusive of science fiction? Even setting aside the continued migration of sci-fi fandom into the mainstream, claiming the spec-fic label distances your work from its obvious target audience while denigrating a useful and enjoyable mindset. It’s a bit pompous, a bit pretentious, and ultimately meaningless. The books may not qualify as hard sci-fi, but I promise not to be offended when they end up in the same section, with or without the accompanying speculative fiction proselytizing. Remember that speculation comes from the Latin specere, to observe, and is based on coming to conclusions about the world through thought. Science also begins with observation, then continues by rending untruth from plausibility through experiment and exploration. Both begin with observation; where do your dreams stop?