Monday, June 30, 2014

Not only can they stand and walk, they can even run!

If I believed in God, I'd feel like the apple of her eye right now. It's because of the timing.

Just as I'm succumbing to a little old-fashioned thinking and deciding maybe now, at 34 weeks pregnant, I'll stop running and stick to the bike, a friend (an angel?) sends me this article in the USA Today about a woman who's also 34 weeks pregnant:

Alysia Montano just doing her thing, which happens to also show the world what pregnancy can be. (source)
She's still running like a cheetah.

And guess what? Cheetahs get pregnant too! Always have!

And you know what? Gazelles get pregnant too! Always have!

So whether you see hominins as the predators or the prey at any given time in human evolution, the pregnant ones, at least in our direct lineage, were obviously doing all right. (Flashback to similar thinking, here.)

There's footage of Montano on the move, and an interview about it all at ESPN. Those questions reflect some old-fashioned thinking that I wish the interviewer had explicitly distanced herself from. That's because we're in the future now. Women aren't asked to stop being active, strong humans just because they're growing a kid. In fact, those of us who are active and who have good healthcare and routine pregnancy check-ups are encouraged by our midwives and doctors to stay active, to stay fit throughout pregnancy.

Society expects non-pregnant women to do the same. Society also expects kids growing outside the womb to do the same. Is any of that any different? Do people actually think that all their female ancestors sat on ass while pregnant?

I guess some people don't think. Or maybe they haven't had a rich enough education in both history and natural history to give them the context for thinking about this. Maybe they take a doctor's (god's?) or a scientist's (god's) Word at any given moment, no matter how many decades ago it was uttered, as scripture, forever and ever amen. Especially when it comes to babies, those little miracles.

There are so many of us who've spent our entire lives being active, and as a result are pretty strong human beings. This goes for women who have no choice and must labor physically during their lives to stay alive and well and to keep their families alive and well. But this also goes for women in places like the U.S. where many women have had the opportunity to be athletes as little girls and throughout their lives.

Many of us products of Title IX are reproducing and we sure as hell aren't going to have healthy pregnancies being inactive, being someone else for nine months.

I hope every woman watches Montano and takes inspiration from her if she'd like to. And that might include getting a new and informed, dare I say futuristic, doctor if her current one's holding her back.

I also hope people see Montano running strongly (and disgustingly fast) so that they'll maybe think twice before admonishing a pregnant stranger for lifting that bag of dog food. Or for tsking at her for hiking on any inclines because she should know better and should stick to the flat paths congested with strollers and flip-floppers instead. 

Me, 32 weeks pregnant. On top of Mt. Sargent, Acadia National Park.
To the women and men of our society who have culturally limited expectations of the female body: Watch this video of Montano and then think, react, and act accordingly--with that much broader picture of "woman"--to the rest of us, pregnant or not.

We're everywhere, us active pregnant ladies, but it's something else to see such a prominent one put herself out there like Alysia Montano did. I hope that her two swift laps around that track will go a long way toward expanding minds about expanding uteruses.  

Sunday, June 29, 2014

Whoops I did it again

Sorry for publishing a draft, folks! 14 views before I noticed that I'd scheduled it for the wrong day! It will be polished and posted tomorrow. Forgive me, I'm deep into summertime, which means I hardly know what day it is, I eat pie for breakfast, and I resist footwear.

Friday, June 27, 2014

Who has all the answers?

We write about a lot of things on Mermaid's Tale.  All of us who contribute to this blog try to be thought-provoking in interesting and hopefully non-standard ways.  We don't have all the answers, and indeed we tend to write about things for which nobody has the answers.

We try to see the various (often more than the proverbial two) sides or facets of important problems in genetics, epidemiology, and evolution--and their philosophical and societal aspects.  What is clear is that many aspects of these areas, not least the search for, or even the notion of, causation, are elusive, and that in many ways our science in these areas has only the crudest forms of theoretical understanding.  Or, perhaps, we need some theory that differs from that of the physical sciences, in which the same things--electrons, oxygen molecules, gravitational attractions--are either completely replicable or deterministic.

In this context, we do use our modest forum to criticize the widespread claiming of answers, and to examine why such claims are usually, like reports of Mark Twain's death, greatly exaggerated.  Our society tends to reward those who claim and proclaim their work, and we try to temper that.

We try to be polite in what we write, and hope we don't go over the Snark line too often.  Sometimes a nerve (or an ideology) will be hit, and in our experience the response can be vehement to say the least, because our society currently doesn't give much respect to decorum. The uninhibited nature of the web brings out the worst in people, sometimes.  Our society, our academics, selfish aspects of capitalism, adversarial advertising culture, competitive careerism and our built-in systems of advocacy, too often don't lend themselves to better discussions.  In our lifetime it certainly has become less self-restrained, though perhaps every generation thinks the past was better than the present.

Nonetheless, there is far more that we (people in general, and scientists in particular) do not know than we care to acknowledge, and this leads to competition for attention and resources that we think should be resisted.  Criticism, if properly placed, can at least hope to nudge things in a better direction.  Acceptance is a form of acquiescence.

Unfortunately, often neither we nor other critics have magic answers, and "tell us what to do instead" is the retort of last resort from those who acknowledge the problems but want, conveniently, to carry on until someone instructs them to do otherwise--until someone provides some new fad by which grants can be sought, and so on.  That's not how science should be done.  If there are problems, and they are generally acknowledged, those who recognize them, and who are working in the area, should pause and try themselves to figure out truly better ways.  Isn't that supposed to be the job of science?  Unfortunately, doing that involves thinking, and the demands of writing grant applications for doing more of the same don't always leave time for that.

In addition, we think that searching for strange or perplexing results, even amidst claims of understanding, is a way we can try to stimulate more creative work, to the extent that anyone's listening.

In particular, it is young people, who want a career in science, philosophy, or other thinking professions, who need to take the baton and run with it.  That's the only way that change will happen.  We think this starts with asking "If the standard explanations aren't really right, what might be right instead?  If what is alleged to be true is not true, could the opposite be true in some way instead?"

In the often notoriously stifling institutional environment of universities, every stimulus to creative thought is, we think, worth the effort.  Even if we ourselves personally never know any answers to anything.  If young people don't try to cut through the spider web of institutional inertia, they will be the spider's next meal--a 50-year, career-long meal.

Thursday, June 26, 2014

Smart as can bee; leafcutter bees, another example

A week or so ago we blogged about a paper on bee navigation and other aspects of animal behavior, and whether such behavior can be said to be evidence of 'intelligence'.  We mused about the word, and about whether animals other than ourselves are 'intelligent' or 'sentient' or--the real big Prize--'conscious'.  That post generated quite a lot of interest.

It's perhaps a definitional issue, beautifully suited to endless debate and a guarantee of no solution.  But one thing is for sure, we think:  what the little old bee-brain guys are doing is quite complex.

In the interim, we read this beautiful and fascinating post by Hollis Marriot on her blog, "In The Company of Plants and Rocks."  Hollis is, we think it's right to say, truly a naturalist (though a professional botanist) who lives and works in Wyoming and blogs about botany, geology, nature, her travels, and more.  She's a beautiful writer, photographer and observer of the world.  The question this particular post raises is similar to ours about bee navigation or the problem-solving talents of crows.  In this case, she describes how solitary leaf-cutter bees cut a disk from nearby leaves, curl it up, tuck it under their legs like an architect carrying building blueprints, and hie back home.  Home to a leaf-cutter bee is likely to be a crevice in an old piece of wood.  They build a dozen or so cells inside their crevice, lay an egg in each one and seal it with one of the leafy disks they've harvested a short distance away.


Leafcutter bee: Wikipedia

Now just think about any of the acts this behavior requires: finding an appropriate leaf (is it by some taste-test as well as size and so on?), then knowing how to cut a circular disk (without using a compass to inscribe it first!), then how to roll it up and tuck it between (well, among) the legs, and so on.  This is complex behavior and cannot entirely be pre-programmed.  That's because no a priori program can know where the tasty, cushiony leaves will be, nor how to fold them up, and so on.  The bee must look, smell, hear or whatever around its environment; resolve various images such as the trees and leaves, often when both it and they are moving; assess them; know how to work the directed aeronautics of its wings to get there; and use its many appendages to land and its complex mouthparts to carve.  And then it must do the apparently simple thing of tucking the disk in amongst six (count 'em!) legs, and adjust its aeronautics so it can still fly properly back home.

To me, this is mental behavior, and whether or not you want to say it involves 'thinking' is basically a semantic question.  I personally would call it intelligent, far less robotic than, say, how amoebas mechanically and purely biochemically flow pseudopods towards food or respond to light.  It involves neurons, sensory systems, limbs and so on that use many of the same genes we use for the same systems.  The bee may not be doing full-blown trigonometry, but what it is doing is still complex.

This is what leads me to think that those who are too restrictive about what counts as intelligence, as we discussed last week, are minimizing a very important, and fascinating, question:  How can DNA and its coded products possibly achieve such feats?  This is not a mystical question, nor any invocation of mind-matter dualism or immaterialism.  It's simply a willingness to acknowledge that our understanding of brain function, and of the translation of linear codes into 4-dimensional actions, is at present elaborate in data-detail and paltry in substance.

From this point of view, none of us should be making pronouncements about what is 'genetic' and what 'must be' programmed and what 'environmental'.  This is another illustration of how little we clearly know, and it should make us much more humble about what we claim as knowledge--about bees, much less human behavior.

Wednesday, June 25, 2014

O, death, where is thy sting: the bees will continue to die

'Widespread impacts of neonicotinoids "impossible to deny,"' says the headline at the BBC.  The Worldwide Integrated Assessment (WIA) of the effects of systemic pesticides, a report released on June 24, concludes that the fairly new pesticides, neonicotinoids (neonics) and fipronil, are as bad for ecosystems as was DDT.  The report was written by the IUCN, the International Union for the Conservation of Nature, after a review of 800 peer-reviewed journal articles published in the last 20 years.  The IUCN is an environmental conservation group, and thus can't be said to be neutral in the debate over the role of these pesticides in ecosystems, particularly with respect to colony collapse disorder, but that certainly doesn't mean they are wrong.  

Systemic pesticides are water-soluble chemicals and thus can be moved through a plant's vascular system to infiltrate its every cell, where they can remain for perhaps the life of the plant.  Whatever eats the plant is then exposed to the pesticides, which, according to the BBC, are 6000 times more toxic than DDT (though I am not sure what that actually means: that they last 6000 times longer, or kill 6000 times faster, or kill 6000 times the number of organisms?) but less toxic to mammals than some older classes of pesticide.
  

Healthy frame from hive, top; evidence of CCD, below.
Photos: Keith Delaplane (from Oldroyd, 2007) and Reed Johnson; Source
According to the report, neonics and fipronil are the third most widely used pesticides today, and because they are used so widely, prophylactically as well as for pest control after the fact, they are found in the air, the water and the soil.  And they are affecting a wide range of organisms, perhaps including us.  Yep, you might be eating them for dinner.
The combination of their widescale use and inherent properties [of these pesticides], has resulted in widespread contamination of agricultural soils, freshwater resources, wetlands, non-target vegetation, estuarine and coastal marine systems. This means that many organisms inhabiting these habitats are being repeatedly and chronically exposed to effective concentrations of these insecticides.
And,
The combination of prophylactic use, persistence, mobility, systemic properties and chronic toxicity is predicted to result in substantial impacts on biodiversity and ecosystem functioning.
The authors of this report recommend worldwide reduction in their use, continued research into alternatives, such as integrated pest management, and tighter regulation.

Pesticides are meant to kill, so that they are doing their job is no surprise.  But the most pressing question is whether they are also responsible for colony collapse disorder (CCD).  The European Union has banned the use of neonics for 2 years, in an attempt to answer this question.  However, there is currently some pressure from Britain's National Farmers Union to overturn the ban on combinable crops in time for autumn rapeseed planting -- without neonics, growers warn, 2014 could be the last big harvest in Europe.  Of course, the NFU can't be said to be a disinterested party, either.

And, despite the report, pesticide manufacturers (another party with a vested interest) deny that these chemicals are responsible for CCD, citing, for example, the persistence of tree bumblebees despite two decades of neonicotinoid use.  CCD, the sudden disappearance of worker bees from a beehive or colony, usually occurs with little or no build-up of dead bees in or around the affected hives, and with the early death of adult worker bees away from the hive.  This leaves affected hives populated primarily by young adults.  The queen is usually in the hive as well, but without worker bees the colony can't sustain itself, and the remaining bees eventually die.

Winter is hardest on bee colonies, and since CCD was first reported, the proportion of beehives that die during a given winter has ranged from the expected 10% or so to 60% or more in some areas during bad winters such as that of 2013-14.  Why this is happening is not at all clear -- perhaps varroa mites, perhaps a virus, perhaps overwork and stress, perhaps neonicotinoids, perhaps a combination of factors.  Indeed, CCD had not hit Canada or Australia until recently, although neonics have been in use in both places.  But varroa mites weren't a problem in Australia, either, so it's hard to know what this means about the cause of CCD.

Colony collapse disorder has become politicized, in part because a lot of money is at stake, and in part because environmental protection is itself politicized.  It's the same kind of conservative vs liberal hate-fest, or, in this case, food-fight that we also see over the truth of evolution and climate change.  It doesn't help, of course, that understanding why bees are dying is so difficult.  But, similar questions arise in many other areas these days, like: what causes autism, or obesity, or hyperactivity?  Identifying the cause of a complex disorder turns out to be difficult, and 'cause' may not even be the appropriate word, though we seem to have no better.

The WIA report will not end the debate.  There will be charges of vested interest on both sides, and there will be inaction.  Environmental consequences of pesticide use will mount, and bees will continue to die.  But, for hungry humans, they may have the last sting.

Tuesday, June 24, 2014

Epigenetics and adaptive evolution: which wins?

(this is a slightly revised version of this post: see last paragraph)

Yesterday we discussed some aspects of epigenetics, that is, the modification of DNA that does not alter its nucleotide sequence but does induce or repress expression of genes in the modified area of the chromosome.  There were a lot of comments and replies on that post that you might want to browse.  

However, one reader tweeted a very interesting question that we'd like to address here where others could see his and our thinking. He wrote: Could epigenetic inheritance make genetic assimilation more potent since environmentally induced phenotype is multigenerational?

We thought this was worth addressing here, as it raises important issues about how evolution works.  

For readers who may not know the term, 'genetic assimilation' refers to a situation in which some environmental factor induces one of a particular set of possible states (sometimes called polyphenism).  This has long been studied in many different species, though much of the work was done in the early 20th century's pre-gene era--that is, when actual known genes were few and far between.  Earlier studies had to rely on traits for which there was reasonably specific evidence, even if indirect, that their variation was due to genetic variation.

Clearly we now know that many traits, including behavioral traits of various kinds, are affected by epigenetic changes as well as by DNA sequence variation.  In themselves, traits due to epigenetic changes have not seemed to be inherited, although, since gene expression responds directly to the environmental changes a cell detects, epigenetic changes are a major mechanism of local adaptation.  The idea has been that the epigenetically achieved state of an organism's trait would not be inherited: each organism starts life afresh in terms of its gene usage.

Yet there are by now many studies in various species, direct and indirect, observational and experimental, that show not just that changes in cell behavior involve epigenetic mechanisms, but that epigenetic changes may sometimes be inherited--perhaps even for multiple generations.  This has been seen as a threat to Darwinian theory, to the extent that such assertions have been sneered at as almost ante-diluvian Lamarckian nonsense.  Still, there is the evidence, and it's from legitimate investigators, not crack-pots, and from legitimate journals (well, some of them, like Nature and Science, go for sensationalistic stories, sometimes without looking all that closely at the evidence).  So those who react against anything that smacks of environmentalism should stop and take a breath.

There is nothing at all surprising about organisms reacting to their environment, nor, since we are made of cells that express genes, about that reaction involving gene-usage changes.  Epigenetic changes are not mystical, their mechanisms are clearly known, and at most they would change the criteria to which the term 'inheritance' is applied.  This is much as the term 'gene', because of increased understanding of DNA, has rapidly lost its standard 20th-century meaning--with no threat to basic genetic or evolutionary theory.

Genetic assimilation
This is a term coined by CH Waddington in the mid-20th century, but it applies to phenomena studied in the late 19th century under names like the Baldwin effect (for more on this history, you could see my 2004 Evolutionary Anthropology article, "Doin' what comes naturally").  Classic experiments were done to show that genetic assimilation can happen.  [Waddington was a quirky character who made lots of enemies and was dismissed by many, often for political reasons, but that's irrelevant here.]


CH Waddington
The idea was that an environmentally induced trait, including a behavior, might occasionally lead to higher reproductive success, and if in a stable environment that success was consistently achieved by having the trait, it could be beneficial for mutations that happened to generate the trait directly to increase in frequency.  That is, if such mutations arose, selection could favor them so that eventually a trait that had once been environmentally induced became genetically hard-wired.
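
For readers who like to see this logic in motion, here's a minimal toy sketch (our own illustration, not Waddington's actual experiments; the population size, induction probability and fitness bonus are all invented numbers).  A 'plastic' allele shows the favored trait only when the environment induces it; a 'hard-wired' mutant always shows it, and so gains a small but consistent edge:

```python
import random

# Toy sketch of genetic assimilation: a haploid Wright-Fisher population
# with one locus.  Plastic individuals express the favored trait only when
# the environment induces it; hard-wired mutants always express it.
# All parameter values are invented, for illustration only.
N = 1000              # population size
GENERATIONS = 500
P_INDUCE = 0.7        # chance the environment induces the trait in a plastic individual
FITNESS_BONUS = 0.1   # relative advantage of expressing the trait

def expresses_trait(hardwired):
    return hardwired or random.random() < P_INDUCE

pop = [True] * (N // 20) + [False] * (N - N // 20)  # mutant starts at 5%

for gen in range(GENERATIONS):
    # fitness-weighted resampling = selection plus drift
    weights = [(1 + FITNESS_BONUS) if expresses_trait(ind) else 1.0 for ind in pop]
    pop = random.choices(pop, weights=weights, k=N)
    freq = sum(pop) / N
    if freq in (0.0, 1.0):
        print(f"generation {gen}: hard-wired allele {'fixed' if freq else 'lost'}")
        break
```

Run it a few times: because plastic individuals also show the trait 70% of the time, selection on the hard-wired allele is weak, and even a favored allele is sometimes lost to drift.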

How often this actually happens in nature is debatable and controversial, because to some it has seemed to verge on non-Darwinian evolution.  But if or when it happens, genetic assimilation would, in a sense, guarantee that the individual had the advantageous trait.  This is what led our correspondent to ask whether epigenetic changes that could be passed down over some generations might give normal genetic evolutionary mechanisms a chance to occur, by presenting the favored trait to the environment more stably and consistently than if it depended on chance epigenetic mechanisms each generation.

The obvious answer is that this is certainly plausible. But it would have to persist for far longer than has been observed, to our knowledge, to match the slowness and random-mutational aspect of normal evolution.  As important, to us, is that if one thinks carefully about evolution, it is not so clear what would actually happen.

But good or bad?  Is epigenetic or genetic causation more 'fit'?
If the trait were hard-wired by mutation, environmental induction wouldn't be necessary.  When the relevant environment is present, the organism doesn't have to rely on any chancy aspects of epigenetics to make it fit, and relative to natural selection it has an edge over a competitor who does have to rely on that chance.  Of course, the mutations shouldn't reduce the chance of a response, say by erasing the DNA signal for epigenetic marking.  But if they guaranteed the trait, the organism starts out life in an adapted state.

Yet one can ask when it is good for a species to be hard-wired.  One could argue that it is not such a good thing, because the organism may be far less able to adapt to different or changing circumstances.  Depending on the species, its population size and habits and reproductive biology, it might be far better for each individual or local set of kin to adapt in an epigenetic way, when or if the environmental circumstance arises.  An epigenetic response would be reversible when environments change.  Let the environment do the talking, so long as the organism can respond to it.

However, an obvious analogue to natural selection applies to epigenetic traits: if they are really passed down from one generation to the next, that is itself like a form of hard-wiring that's only somewhat 'softer' than incorporating the trait into a deterministic DNA sequence.  It might be better not to transmit the trait even epigenetically.

If the environmental factor is utterly inevitable, hard-setting by DNA sequence might be as good as, or even surer than, epigenetic transmission.  But otherwise, maybe the risk of having to experience the environment and then adjust to it epigenetically is worth the ability not to make that epigenetic change until it's necessary.  In other words, genetic and epigenetic pre-determination may both be risky.  In a less certain environment, it may be better for each individual to learn by experience how to respond to what it faces; in that case, epigenetic marking is a good way to go, but not necessarily epigenetic transmission across generations.
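
The trade-off in the last two paragraphs can be put in simple numbers.  Here is a toy comparison (ours, with invented fitness values, not drawn from any study): a hard-wired type that fits one environmental state well but pays in the other, versus a responsive type that pays a small cost but matches either state.  Across generations, what matters is the geometric, not the arithmetic, mean fitness:

```python
import numpy as np

# Toy comparison of hard-wired vs responsive ('plastic') strategies in a
# fluctuating environment.  All fitness values are invented for illustration.
rng = np.random.default_rng(11)
T = 10_000
env_a = rng.random(T) < 0.6          # True = state A, 60% of generations

w_hard = np.where(env_a, 1.3, 0.7)   # great in state A, poor in state B
w_plastic = np.full(T, 1.02)         # modest but reliable in both states

for name, w in (("hard-wired", w_hard), ("plastic", w_plastic)):
    arith = w.mean()
    geom = np.exp(np.log(w).mean())  # geometric mean: long-run growth rate
    print(f"{name}: arithmetic mean {arith:.3f}, long-run growth {geom:.3f}")
# The hard-wired type wins on arithmetic mean (~1.06 vs 1.02) but loses on
# long-run growth (~1.01 vs 1.02): variance is costly across generations.
```

Which strategy wins depends on how predictable the environment is, which is just the point made above.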

Here we have some testable, sensible scientific issues with no obvious answer.  Of course, one first has to accept the reality of epigenetic effects and their transmission under at least some conditions, before one can even ask the question.  But whether conceptual assimilation by scientists is as real as genetic assimilation is not so clear.

Added after posting:  Today the current June 2014 issue of Trends in Genetics arrived in my box.  It has a nice review of epigenetic mechanisms, including discussion of the evidence (D'Urso and Brickner, vol. 30(6): 230-236).  It notes trans-generational effects, in the context of fitness and evolution.  It is probably not as circumspect as it might be about how long such effects last (as discussed above), but it is a good review for readers who would like to know more about the actual phenomenon and the evidence.

Monday, June 23, 2014

Epigenetics: the burden of proof vs the folly of dismissal

We haven't written much about epigenetics for a while, in part because it's so trendy that it's impossible to know what much of it means or how it's all going to shake out, and in part because there are so many different interpretations of the word that it's hard to know whether everyone's talking about the same thing.  Tools to detect epigenetic changes in the genome, that is, specific locations that have been chemically modified in ways that affect nearby gene transcription, are now available.  Still, it is clearly a fad in the sense that once tools are there, the scientific community seizes them in a bandwagon effect, and they show up in study designs and grant applications and so on, in ways that can exceed the reality.  Everybody now simply 'has' to do an epigenetic analysis on their favorite project.  And partly for this reason, not everyone accepts that epigenetics will prove in the long run to be a significant actor in development and disease.

Epigenetics: what is it?
We've posted about epigenetics in the past (e.g., here and here).  Relevant to today's post, the idea of epigenetic change is that DNA can be chemically modified not by changing its nucleotide sequence itself, but by altering the packaging of the chromosome near genes that are to be expressed (or repressed).  Different cells in the body are different because of their differential use of genes (cells in your brain and your skull have the same DNA, but turn on different sets of genes to become what they are, as well as to do what they do throughout life), and epigenetic modification is simply one of the mechanisms by which that differential use is achieved.  That's not the controversy.

The controversy goes a bit deeper, and reactions to claims of epigenetic changes are emotional and often vehement.  There are two reasons for this.  First, what causes epigenetic marking of specific chromosome regions is the state of the cell at a given time, and that can change in response to the conditions the cell senses--its environment.  The evidence is, further, that until active mechanisms alter the epigenetic marking of chromosomes, the gene-expression pattern of the cell is inherited when it divides.  Since the inducing mechanism is part of the environmental situation of the cell, this gets uncomfortably close to Lamarckian inheritance.  The cartoon example of Lamarck's pre-Darwinian idea is the giraffe stretching to reach high leaves and, if successful, passing on long-neck genes to its offspring.  This is the antithesis of modern Darwinian theory (though Darwin himself toyed with it in his own theory of inheritance).  Darwinian theory has randomly arising mutation being screened by natural selection to pick the successful genes: the lucky giraffe that happened to have inherited a long-neck genetic variant ate better and had more girafflets than its shorter-necked peers.  Here, the facts speak for themselves, and nothing known suggests that striving for something can in itself engineer heritable genetic change, specific DNA mutations, to make that something happen.

Masai giraffe; Wikipedia

But the second reason that epigenetics touches raw ideological nerves, especially in regard to humans, is that one school of thought wants to see everything human as written deterministically in our genomes: you (including your behavior) are what your DNA sequences prescribe.  Anyone offering any other suggestion is widely denigrated by this group, without inhibition, as a soft-headed denier of the importance of heredity.  That's because if environments really do affect your achieved nature, then genetic determinism and all that goes with it are no longer biological universals set in stone.

But what if, beyond environmental effects on gene expression in an individual during its lifetime, those effects were heritable into the next and future generations?  That would suggest that we are not just dealing with a fad made possible by a fancy bit of gear that can help you get a grant, but that there are things about our achieved natures and our evolution that we don't yet really understand.  And a couple of papers, one from last year, and one more recent, struck us as worth writing about, for different reasons.

Epigenetic---and then some?
A lot of attention was paid to a December 2013 paper in Nature Neuroscience ("Parental olfactory experience influences behavior and neural structure in subsequent generations", Dias and Ressler). Their point was first to show that your experiences involving odor detection are specific and can leave a long-lasting chemical and behavioral 'memory'.  Dias and Ressler exposed mice to a particular well-studied single-molecule odor, and coupled that exposure with a shock to the mouse's foot, to condition the animals to fear the odor (a logic resembling Pavlov's famous dog experiments).  This triggered the activation of cells that express a particular odor-detection (olfactory receptor, or OR) gene, out of the repertoire of about 1000 such genes, as well as the conditioned fear response upon smelling the odor, even absent the shock. But the authors reported something much more remarkable, and challenging to understand.

What they found was that the behavioral response to that same odor is activated in at least two future generations that had not been exposed to the fear-conditioning.
We subjected F0 mice to odor fear conditioning before conception and found that subsequently conceived F1 and F2 generations had an increased behavioral sensitivity to the F0-conditioned odor, but not to other odors... Bisulfite sequencing of sperm DNA from conditioned F0 males and F1 naive offspring revealed CpG hypomethylation in the Olfr151 gene.  In addition, in vitro fertilization, F2 inheritance and cross-fostering revealed that these transgenerational effects are inherited via parental gametes.  Our findings provide a framework for addressing how environmental information may be inherited transgenerationally at behavioral, neuroanatomical and epigenetic levels.
A very important part of this is that the transmission was by males who had been conditioned, via their sperm, to females who had no such experience....and then to their sons' offspring (that is, the marking was present on the sons' sperm cells, directing hyper-expression of the OR gene in the grandchild-mice).

Reaction to this paper seemed to fall along party lines, with determinists doubting that the results could be real, and others intrigued.  Indeed, this is curious because the experience affects not just the startled males' odor-detecting mechanism in their nose cells, where odors are detected, and their fear response, but seems to imprint the specific effect on sperm cells.  There is no means known (to us, at least) by which this could occur, unless all cells' OR genes are affected during the F0 males' conditioning; nor are we qualified or patient enough to judge whether the study, or its set-up, has in some way led to a misleading result.  To be fair, the authors themselves did not venture a mechanism, and they recognized the issue in the Discussion--though they didn't demur from sending their paper to a Nature journal, nor (in expected fashion) did the Nature journal demur from publishing without requiring such a mechanism to be shown.

If this sort of specific epigenetic mechanism does in fact persist across generations, without further conditioning, many questions are raised.  Not only is the targeting mechanism important to know, but since every generation has different experiences, what sort of expression mishmash would new pups have after millions of years of evolution in all sorts of environments?  Nonetheless, mice (and we, and trees, and even bacteria) are differentiating organisms that respond to environmental conditions in ways that certainly include altered gene expression.  So being skeptical may be fully justified, and this is not in any sense an "Aha!" moment for Lamarckians.  But its strangeness relative to what is currently known is no reason to dismiss it just because it doesn't fit your, say, genomic determinist or selectionist predilections.  This is especially so because there is in fact a lot of evidence for environmentally induced changes in gene usage, and hence in the traits, of organisms including humans.  And that brings us to the other, more recent paper.

Do big bodies mean big epigenetic news?
The other paper is a recent report in The Lancet ("DNA methylation and body-mass index: a genome-wide analysis," Dick et al., 2014), which describes the results of a genome-wide analysis of methylation at CpG sites and its association with obesity, measured by BMI.  CpG refers to a C nucleotide sitting next to a G nucleotide along a DNA strand.  Methylation chemically attaches a small tag to such a CpG in gene-regulating areas of a chromosome, making it hard for the proteins needed to express a nearby gene to bind to the DNA and do their job.  That is, the expression of methylated genes is repressed.
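
To make the shape of such an analysis concrete, here is a minimal sketch of the EWAS logic on entirely simulated data (this is our toy, not Dick et al.'s actual pipeline; the 'site 42' effect and every number in it are invented):

```python
import numpy as np
from scipy import stats

# Toy EWAS: n people, m CpG sites, methylation levels ('beta values') on a
# 0-1 scale.  We make one site track BMI and then test every site, using a
# Bonferroni correction for the m tests.  Simulated data, illustration only.
rng = np.random.default_rng(1)
n, m = 500, 1000
meth = rng.beta(2, 2, size=(n, m))                          # methylation levels
bmi = 27 + 10 * (meth[:, 42] - 0.5) + rng.normal(0, 3, n)   # site 42 is 'real'

pvals = np.array([stats.pearsonr(meth[:, j], bmi)[1] for j in range(m)])
hits = np.where(pvals < 0.05 / m)[0]                        # Bonferroni threshold
print("sites passing multiple-test correction:", hits)      # should report site 42
```

Note that an association found this way says nothing about which came first, methylation or obesity--a point that turns out to matter below.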

A commentary in the June 7 Lancet applauds the work of Dick et al., and heralds the beginning of the "EWAS [epigenome-wide association study] era."  Of course, one's first reaction might be a sigh of 'here we go again!' in regard to hype far out-performing hope, a new fad for rescuing hopeless non-replicable findings, and journals having to sell copy and holding no standards of circumspection. But how should one react?

The possible significance of epigenetics to disease has not been lost on epidemiologists, and a new field called epigenetic epidemiology is abornin', counting on the importance of non-sequence modifications of DNA, in particular methylation and acetylation patterns, to (finally!) explain patterns of disease.  In that sense EWAS may be important, or may to a cynic just be an E-for-G swap to keep the GWAS funding flowing.

The Dick et al. paper is from this burgeoning field.  Epidemiology had many successes in the last century identifying environmental causes of disease, but when complex chronic diseases overtook infectious diseases as leading causes of death, the field had a much rougher time finding the causes of major diseases, and predicting who would get them.  So, epidemiology turned to genetics, but ran into the same problem genetics itself was up against -- complexity.  But if specific epigenetic changes can now be attributed in a useful way to environmental factors, on say the McFood-O-Meter scale, the claim will be that reductionist science has found the mechanism that shows that new epidemiological studies will have to be funded to focus on the risk factor that causes the epigenetic change.

It's not an entirely new idea.  For some years, George Davey Smith, an epidemiologist at the University of Bristol, has been advocating the use of 'Mendelian randomisation', a strategy to see whether a variant in a gene whose function relates to processing some environmental factor has the same effect in people not exposed to that factor as in those who are.  Maybe someone will cook up other prevention or treatment strategies if epigenetic mechanisms prove important.

Dick et al. identified five methylation sites in the genome that in their sample were associated with being overweight by the Body Mass Index (BMI) criterion: three were in intron 1 of the HIF3A gene. HIF3A is a gene that regulates response to reduced oxygen levels.  The authors note that "Although the main focus on HIF has been its role in cellular and vascular response to changes in oxygen tension during normal development or pathological processes (eg, cardiovascular disease and cancer), compelling and increasing experimental data suggest that the HIF system also plays a key part in metabolism, energy expenditure, and obesity."

Have Dick et al. found the cause of obesity?  Well, no.  As the Lancet commentary points out, there are numerous difficulties in epigenetic research, a primary one being that a gene won't be modified in every tissue, nor all the time, nor even necessarily in every cell of the appropriate type in a given tissue.  That means that the choice of tissues in which to search for methylation and when to look are crucially important considerations.  Dick et al. did test various tissues, and found that methylation varied.

Another important issue in epigenetic studies is determining the order of events -- which came first, the disorder or the DNA modification?  That is, does the disorder lead to methylation of genes involved, or does methylation of related genes cause the disorder?
Dick and colleagues attempt to address the issue of causality by applying a mendelian randomisation approach to interrogate the causal relation between HIF3A methylation and BMI. This approach uses a genetic proxy for DNA methylation (namely, methylation quantitative trait loci) to identify a causal relation between an exposure or trait and epigenetic variation, assuming that genetic associations are largely immune to residual confounding and reverse causation. Dick and colleagues identified two upstream single nucleotide polymorphisms that were independently associated with DNA methylation at a HIF3A locus in both the discovery and replication cohorts. However, these single nucleotide polymorphisms were not associated with BMI in the study cohorts or the high-powered GIANT consortium dataset, suggesting that hypermethylation at the HIF3A locus is likely to be a result of increased BMI rather than a causal association between increased methylation and BMI.
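The logic of that quoted passage is easy to see in a toy simulation (ours, with made-up effect sizes; not the study's data).  In scenario A, BMI drives methylation; in scenario B, methylation drives BMI.  In both, a SNP shifts methylation, so whether the SNP associates with BMI separates the two:

```python
import numpy as np
from scipy import stats

# Toy mendelian-randomisation logic, on simulated data.  A SNP shifts
# methylation in both scenarios; only when methylation causes BMI should
# the SNP itself associate with BMI.  All effect sizes are invented.
rng = np.random.default_rng(7)
n = 5000
snp = rng.binomial(2, 0.3, n)                     # genotypes 0/1/2

# Scenario A: obesity comes first, methylation follows it
bmi_a = rng.normal(27, 4, n)
meth_a = 0.3 + 0.05 * snp + 0.01 * (bmi_a - 27) + rng.normal(0, 0.05, n)

# Scenario B: methylation drives BMI
meth_b = 0.3 + 0.05 * snp + rng.normal(0, 0.05, n)
bmi_b = 27 + 20 * (meth_b - 0.3) + rng.normal(0, 4, n)

for label, bmi in (("A: BMI -> methylation", bmi_a), ("B: methylation -> BMI", bmi_b)):
    r, p = stats.pearsonr(snp, bmi)
    print(f"scenario {label}: SNP-BMI r = {r:.3f}, p = {p:.2g}")
# Only scenario B yields a SNP-BMI association; the SNPs' silence in the
# real data is what pointed toward reverse causation (BMI first).
```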
So, apparently the obesity came first, methylation later, and has nothing necessarily to do with the cause of obesity. Interestingly, The Lancet still describes this as an important study. "Dick and colleagues’ study represents an important advance for both obesity-related research and the specialty of epigenetic epidemiology." Why? "The widespread uptake of instruments such as the Illumina 450K HumanMethylation array means that large collaborative EWAS meta-analyses can be done, building on the success of similar approaches in genetics."

Have instrument, will use it.

The burden of proof vs the folly of dismissal
It's early days yet in the understanding of the role of epigenetics in disease and behavior, and there's a lot left to be learned.  There is now a wealth of experimental literature on cells as well as a variety of laboratory species, demonstrating some of the mechanisms of gene regulation that involve epigenetic changes of DNA.  There are carefully done experimental studies that show multi-generational transmission of such changes. There have also been epidemiological and even experimental studies of intra-uterine or maternal experience affecting things like body weight in offspring.  Thus, even without specific epigenetic data at the genome level we have every reason to expect that life experience at any age could affect even complex traits.  And what would be more likely than some sort of epigenetic mechanism to be responsible?

One should also keep in mind that trans-generational correlation can look very much like regular genetic transmission and make a trait look 'genetic' in the classical sense, rather than in the epigenetic sense.

It clearly behooves those advocating, and those dismissing, epigenetic inheritance to keep their powder dry until we can see more clearly into the whites of the genome's eyes.  In fact, since we are obviously differentiated organisms descended from a single cell, who respond in all sorts of physical and behavioral ways to our internal and external environments, it seems obvious that some such mechanisms are fundamental to genome function, as experience clearly suggests.  But how well complex traits like body shape or odor detection would be transmitted not just across cell divisions in specific types of responding cells, but also across generations, is far from clear.

Keeping our powder dry should be automatic for scientists, as this is a very important question well worthy of careful investigation.  But whether we can keep obfuscation by ideology and equipment salesmen at bay is just as serious a question.

Friday, June 20, 2014

Walk this way, over to The Winnower

Holly has a new piece up over at The Winnower in which she describes the difficulty of getting students to understand the nuances of what's known about Neanderthals. Fine piece, very much worth a look.

The Winnower is a new open access online journal, with post publication peer review, also very much worth a look.

Thursday, June 19, 2014

Correlation, cause and how can you tell?

A very good BBC radio program called More or Less tries to point out uses and, mainly, misuses of statistics and data analysis, especially where policy or political claims and the like are involved.  A recent program (June 9) addressed the problem of confusing correlation with causation.  This is something everyone should know about, even if you're not a scientist.  Just because two things can be seen together does not mean that one causes the other.

The program mentioned an entertaining web site, Spurious Correlations, that you can find here.  It's run by Tyler Vigen, a Harvard law student, and it makes the correlation/causation problem glaringly obvious. There is even a feature for finding your own spurious correlation.

Number of people who died by becoming tangled in their bedsheets correlates with total revenue generated by skiing facilities (US).  Source: Spurious Correlations

How to tell if a correlation is spurious is no easy matter.  If many things are woven together in nature, or society, they can change together because of some over-arching shared factor.   But just because they are found together does not mean that in any practical sense they are causally related. Statistical significance of the correlation, meaning that it is unlikely to arise by chance, is a subjective judgment about what you count as 'unlikely.' After all, very unlikely events do occur!

Causation can be indirect, and in a sense it is not always easy to understand just what it means for one thing to 'cause' another.  If wealth leads to buying racy cars and racy cars are less safe, is it the driving, or the car, or the wealth that 'causes' accidents?  If AIDS can be a result of HIV infection, but you can get some symptoms of AIDS without HIV, or have HIV without the symptoms, does HIV cause AIDS?  If the virus is indeed responsible, but only drug users or hookers get or transmit it, is the virus, the drug use, or prostitution, or using prostitutes the 'cause'?

Another problem, besides just the way of thinking of causation, and the judgment about when a correlation is 'significant', relates to what you measure and how you search for shared patterns.  If you look through enough randomly generated patterns, eventually you'll find ones that are similar, with absolutely no causal connection between them.
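
This is easy to demonstrate for yourself.  Here is a minimal sketch (simulated data, our illustration): generate a couple of hundred independent random walks--the rough shape of many year-by-year social statistics--and hunt for the best-correlated pair:

```python
import numpy as np

# 200 independent random walks, 30 'annual' points each, with no causal
# connection anywhere by construction.  Then search all pairs for the
# strongest correlation.
rng = np.random.default_rng(0)
walks = rng.normal(size=(200, 30)).cumsum(axis=1)

corr = np.corrcoef(walks)            # 200 x 200 correlation matrix
np.fill_diagonal(corr, 0)            # ignore each series' self-correlation
i, j = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"best pair: series {i} vs series {j}, r = {corr[i, j]:.2f}")
# With ~20,000 pairs to choose from, the winning |r| is typically above
# 0.9 -- pure chance masquerading as a relationship.
```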

Looking at the examples on the above web site should be a sobering lesson in how to recognize bad, or overstated, or over-reported science.  It won't by itself answer the question about how to determine when a correlation means causation.  Nobody has really solved that one, if indeed it has any sort of single answer.  And there are some curious things to think about.

Just what is causation?
In a purely Newtonian, deterministic universe, in a sense everything is causally connected to everything else, and was determined by the Big Bang.  For example, with universal gravity everything literally affects everything else, through a totally deterministic causal process.

In that sense nothing at all is truly probabilistic.  But quantum mechanics and various related principles of physical science hold that some things may really be truly probabilistic rather than deterministic.  If that is right, then the idea of a 'cause' becomes rather unclear.  How can some outcome truly occur only with some probability?  It verges on an effect without an actual cause.  For example, if the probability of something happening is, say, 15%, what establishes that value--what causes it?  A process that is truly random, and not just beyond our ability to measure precisely, in a sense redefines the very notion of cause.  Such things, some of them seemingly true of the quantum world, really do violate common sense.  So the whole idea of correlation vs causation takes on many different, subtle colors.

Wednesday, June 18, 2014

Republican presidents are bad for our health?

Infant mortality in the US fluctuates with the political party of the President.  A paper published in the June 4 issue of the widely respected International Journal of Epidemiology ("US Infant Mortality and the President's Party," Rodriguez et al.) reports that between 1965 and 2010, infant mortality rates were 3% higher when a Republican was president than when the president was a Democrat.

Rodriguez et al. write that previous "political epidemiology" has been cross-national, and has attempted to determine the effect of policy on public health: welfare states, national health systems vs not, higher social expenditures or medical expenditures per capita vs lower, etc.  Income inequality was found to be correlated with public health until the data were re-analyzed and additional variables controlled (Avendano, 2012), leading Avendano to question whether income inequality was in fact causal, as opposed to either spurious or real but only correlated with some unmeasured variable(s).  Further studies have attempted to determine which social factors are actually causal; some have suggested social expenditure and the generosity of family policies may be.  In this way, political policies may affect public health, but actual causality, rather than just correlation, is difficult to determine.

Rodriguez et al. posed the question, ‘Is the political party of the president of the USA associated with an important, objective and sensitive measure of population health, infant mortality?’ The idea is that the party in power drives macroeconomic policy, and macroeconomic policy influences the socioeconomic milieu, affecting variables that affect health and mortality.

Infant mortality has fallen dramatically, from a total of 24.7 per 1000 births in 1965 to 6.1 in 2010.  In the graph below, the authors have removed that long-term trend, and show total infant mortality, and neonatal and post-neonatal mortality, by president, for Blacks and Whites.  During Democratic administrations, all rates are lower, across the board--on average 3% lower.



Logged IMR, NMR, and PMR residual trends and presidential partisan regimes, 1965–2010.  Source: Rodriguez et al., 2014
The statistical effects are essentially the same for Blacks and Whites, but infant mortality among Black infants is about two times higher than it is among Whites, so the absolute effect is larger.  The percentages may be small -- very small, in fact -- but their consistency does lend them credibility.
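
For readers curious about the mechanics, here's a rough sketch of the detrend-and-compare logic as we read it, run on a synthetic series (the numbers and the simple log-linear trend are ours; this is not Rodriguez et al.'s data or their actual model, which among other things allows a one-year lag):

```python
import numpy as np

# Synthetic IMR series falling from ~24.7 to ~6.1 per 1000 over 1965-2010,
# with random year-to-year noise and no party effect built in.
rng = np.random.default_rng(3)
years = np.arange(1965, 2011)
imr = 24.7 * np.exp(-0.031 * (years - 1965)) * np.exp(rng.normal(0, 0.02, years.size))
republican = (((years >= 1969) & (years < 1977))    # Nixon/Ford
              | ((years >= 1981) & (years < 1993))  # Reagan/Bush
              | ((years >= 2001) & (years < 2009))) # G.W. Bush

# remove the long-run decline with a log-linear fit, keep the residuals
slope, intercept = np.polyfit(years, np.log(imr), 1)
residual = np.log(imr) - (slope * years + intercept)

gap = residual[republican].mean() - residual[~republican].mean()
print(f"mean log-IMR residual gap (Rep - Dem): {gap:+.3f}")
# A gap near +0.03 on the log scale would correspond to the paper's ~3%
# difference; this synthetic series has no such effect, so it hovers near 0.
```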

Several things stand out about these results.  First, as the authors point out, the implementation of policies that might have had a direct effect on infant mortality -- Johnson's Great Society and Medicaid in the 1960s, or the expansion of Medicaid eligibility between 1979 and 1992 -- doesn't correlate with these periodic dips in IMRs.  That would be the easy explanation.  But this means that the correlation with political party may have little or nothing to do with policy differences.

Or, Rodriguez et al. suggest, the correlation could reflect real, cyclical changes in socioeconomic conditions for mothers and infants, depending on national policy.  Or, differential availability of abortion, since high risk fetuses may be more likely to be aborted than fetuses at lower risk.  Or, it might reflect differing attitudes toward health disparities, with Democrats more likely than Republicans to use government to address them -- but what actual governmental policy is implemented, or eliminated, and thus responsible for the fluctuations is anybody's guess.

But there's something curious about these findings.  Neonatal mortality, death before 28 days of life, is generally considered to be due to conditions of pregnancy or to congenital abnormalities.  Post-neonatal mortality, death between 29 days and 1 year, includes sudden infant death syndrome, which isn't correlated with socioeconomic status, but post-neonatal mortality overall is considered to be a reflection of socioeconomic conditions.  If that's so, then neonatal mortality should look quite different from post-neonatal mortality in this study, but it doesn't.  Neonatal mortality shouldn't be fluctuating with policy differences or income inequality or whatever political or economic factors, if any, might be responsible for the trend reported here.  And one would expect there to be a more marked difference between Black neonatal and post-neonatal mortality, since health disparities are most reflected in Black infant deaths.

Equally problematic, presidential terms are short relative to the lag time between implementing a new policy and its effects.  The study did allow for a one-year lag, but still, most health policies don't have immediate impact.  So the incumbent's party may be irrelevant to what happens during his term, or the effect would at least belong to the successor's (sometimes the same, sometimes a different) party.  Do people's expectations, based on the current President's outlook, change their behavior in subtle ways?  Sounds plausible, and it would have nothing to do with the policy change itself, but so many people are uninvolved, uninterested in, or skeptical of the political system that this might not be much of an explanation.  And the pattern goes back before CNN and FOX imitation news organizations had much intentionally motivating influence on what people thought or were aware of.

Still, social and political epidemiology are interesting approaches to understanding the underlying causes of ill health and mortality.  These fields look at risk factors several steps removed from those generally considered causes of disease, so that AIDS, or malaria, e.g., might be attributed to poverty rather than to HIV infection or being bitten by a parasite-carrying mosquito, and legitimately so.  That is, the idea is that poverty increases one's risk of exposure to diseases, and if you eliminate poverty you eliminate risk.  The difficulty, of course, is that enacting public health policy that calls for eliminating poverty is a lot more difficult than distributing bed nets or clean needles.

And the problem of identifying cause from correlation is huge with such metadata.  It can be pretty much pure guesswork to pull causal factors from the social or political hat, as this paper suggests -- in fact, if something like, to make something up, differences in completeness of vital-statistics registration in Republican and Democratic years were responsible for this cyclical dip, it would look just the same as if the cause were changing policy.  The point is that we just don't know.  In addition, it's hard to avoid interpreting results from one's particular political point of view -- maybe there's something interesting in this paper, maybe there's not, but it's very hard to know.

Tuesday, June 17, 2014

Are bees intelligent?

The other day, Ken and I had coffee with a couple of philosophers who spend their time thinking about philosophy of the mind.  What is consciousness?  Do non-human organisms have consciousness?  What is intelligence?  How do we make decisions?  What about ants?  These are hard questions to answer, perhaps even unanswerable, but they are fascinating to think about.

Our meeting was occasioned by the recent paper in PNAS about the mental map of bees ("Way-finding in displaced clock-shifted bees proves bees use a cognitive map", Cheeseman et al.).  Cognitive maps are mental representations of physical places, which mammals use to navigate their surroundings.  Insects clearly have ways to do the same; whether or not they do it with cognitive maps is the question.

Honeybee: Wikipedia


The "computational theory of mind" is the predominant theory of how mammals think -- the brain is posited to be an information processing system, and thinking is the brain computing, or processing information (though, whether this is 'truth' or primarily a reflection of the computer age isn't clear, at least to us).  In vertebrates some at least of this takes place in the section called the hippocampus, or in non-vertebrates in some neurological  homologs.  But, what do insects do?  

Previous work has shown that captured insects, once released, often fly off in the compass direction in which they were headed when they were caught, even if they were moved during capture and the direction is no longer appropriate.  But they can then correct themselves, and have no problem locating their hives.  That indicates that they've got some kind of an "integrated metric map" of their environment.

Some theories have held that they mark the location of the sun relative to the direction they take and then later calculate 'home' based on a computation of time and the motion of the sun.  This by itself would be a lot of sophisticated computing, or thinking....and why not 'intelligence'?
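
To get a feel for how much computing even the sun-compass account demands, here is a back-of-envelope sketch of time-compensated sun-compass arithmetic (our illustration, not a model of the bee's actual neural machinery; the 15-degrees-per-hour drift is a crude average that in reality varies with time of day and latitude):

```python
# A bearing stored relative to the sun's azimuth goes stale as the sun
# moves, so the animal must subtract the drift when it reuses the memory.
SUN_DRIFT_DEG_PER_HOUR = 15.0   # crude average rate of solar azimuth drift

def corrected_bearing(stored_rel_deg, sun_azimuth_now_deg, hours_since_storage):
    """Recover a true compass bearing from a sun-relative memory."""
    rel_now = stored_rel_deg - SUN_DRIFT_DEG_PER_HOUR * hours_since_storage
    return (sun_azimuth_now_deg + rel_now) % 360

# Memory: 'home is 40 degrees clockwise of the sun', made when the sun's
# azimuth was 110 (true home bearing: 150).  Three hours later the sun
# has drifted to about 155 degrees:
print(corrected_bearing(40, 155, 0))   # 195.0 -- naive reuse, 45 degrees off
print(corrected_bearing(40, 155, 3))   # 150.0 -- drift-corrected
```

Clock-shifting the bees, as in the experiment described next, corrupts exactly that elapsed-time term, which is what makes the manipulation informative.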

Cheeseman et al. asked whether instead what they are relying on is a series of snapshots of their environment, which enables them to recognize different landmarks, one after the other as they come into view, rather than a completely integrated mental map.  They experimented with anesthetizing bees and shifting their sense of time, so that they couldn't rely on the sun to get them home.  It took some flying for the bees to recognize that they were off-course, but they always were able to re-orient themselves and get back to the hive.

Cheeseman et al. conclude that because bees don't rely entirely on a sun-compass for their sense of direction, they must have the apian equivalent of a cognitive map.  That is, they collect relevant spatial information from the environment with which they navigate, and use it to make decisions about how to get where they are going.  That is, they take and file away snapshots; remember that insect eyes are complex, including two compound eyes and, in most species, three small, simpler forehead-located ocelli, so this means synthesizing a many-camera pixellation and differently sensitive integration of the light-world.  Then they use a sequence of these frames later, from a different position from that at which the snapshots were taken, so that not all landmarks might even be visible, and at a different time, which can affect shadows, colors, and so on.  Then, lining these up in reverse order, mirror-flipped left-to-right somehow, adjusting their angles of perspective and so on--perhaps also using sound, wind direction, and even perhaps monitoring the olfactory trail (also in reverse relative position) like Hansel and Gretel's bread crumbs--they head home for dinner.

Two big compound eyes, and 3 simpler central ocelli. From http://169.237.77.3/news/valleycarpenterbees.html


To us, this is a remarkable feat for their small brains!  For some of us, even with a human brain, finding one's way home without a GPS is no easy task, and deserves a nice cold drink when done successfully.  However, the philosophers we were chatting with did not think that what Cheeseman et al. believe they discovered about bees should be called a cognitive map because -- and we think we've got this right -- the bees haven't got a mental image of the entire lay of the land.  Instead, it's as though they are connecting the dots: they recognize landmarks and go from one mental snapshot with a familiar landmark to the next.  So what kind of 'intelligence' this is perhaps becomes a definitional question.  Call it mechanical or whatever you want; we would call it 'intelligent' behavior.

We don't know enough about the philosophy (or the biology) of mind to know how significantly these two models differ, or whether 'consciousness' subtly underlies how these judgments about cognition are made; but in any case, that's not what most interested us about the bee story.  What is the experience of being a bee?  Whichever kind of imaging and processing they do to navigate, how do they turn locational information into action?  It's one thing to know that your hive is east (or the apian equivalent) of the pine tree, but getting there requires "knowing" that after you've collected the nectar you want to bring it home, and that means you have to find your way there.  Your mental map, whatever it consists of, must be made operational.  How does that happen in a brain the size of a bee's?  Or an ant's?

Or bird brains?  Crows, members of the corvid family, are considered among the smartest of birds.  Their problem-solving skills have been documented by a number of researchers, but crows have fascinated many non-scientists as well, including our son, who sent this observation from Lake Thun in Switzerland.
Crow found a little paper cup with some dried out dregs of leftover ketchup in the bottom. This is the sort of little paper condiment cup that would come with some french fries. We watched the crow try a couple of times to scrape some ketchup out with his beak, holding the cup down with his foot. It apparently wasn't working enough to his satisfaction, so he flew with the cup to the edge of the water (we were at the lake). He wanted to get the ketchup wet to "hydrate" it, to make it easier to scoop out. That was impressive enough, but what he did next was even more. There were little waves lapping on the "shore" (this was actually in a harbour and the shore was concrete) and each time threatening to carry away his cup. So he picked up the cup and carried it along up and down the shore until he found a little crevasse in the concrete that he could secure the cup, and let the water wash over it without taking it away. Clever.
If that's not intelligence, it's hard to know what is.

One view of intelligence is that it's what's measured by IQ tests.  Or, at least, what humans think 'thinking' is all about.  But this is perhaps a very parochial view.  We tend to dismiss the kind of intricate brainwork required by nonverbal activities, or by athletes, artists, or artisans.  We tend to equate intelligence with the verbal kinds of skills measured on tests devised by the literate segments of society, who use the results to screen for various western-culture activities, suitability for school, and the like.  That's not to suggest that those aspects of brainwork are irrelevant to society, but it is a culturally chosen sort of definition.

Philosophers, and perhaps most psychologists, might not want to credit the crow with 'intelligence', or they may use the word but exclude concepts of perceptual consciousness -- though whether there are adequate grounds for that, other than taking our own experience as the defining one, isn't clear (to us, at least).  In any case, wiring and behavior are empirically observable, but experience much less so; and consciousness, as a component of brain activity and perhaps of intelligence, remains elusive because it's a subjective experience, while science is a method for exploring the empirical and, in that sense, objective world.

If bees, and indeed far tinier insects, can navigate around searching the environment, have ideas about 'home', find mates, and recognize food and dangers -- and can do it all with thousands rather than billions of neurons -- then at present we haven't enough understanding of what 'thinking' is, much less 'intelligence', to know what goes through a bee's or a crow's mind when they're exploring their world....

Monday, June 16, 2014

The misappropriation of human diversity

We somewhat cautiously decided to chime in on the issues driving the widespread internet and media exchanges over the publication of what we, and many others who think about these issues for a living, recognize as a rather simplistic and overtly, gratuitously racist treatment of human genetic variation.  We wanted to point out, from our view at least, why the underlying issues are complex and don't yield to simple answers.  In addition, we don't think it's at all irrelevant that history shows the societal traumas that can result from the arrogantly casual, careless, categorical, and value-laden treatment of one group of people by another.  This has been called scientific racism in the past, and it's scientific racism now.

We tried to avoid personal attacks or polemics, and just lay out the issues as we see them.  However, and not surprisingly given the nature of the current 'discussion', this opened the floodgates.  Let's just say that the number of ways that people can find to tell strangers, under cover of anonymity and with such glee, that they are the scum of the earth is impressive.  This whole subject merits measured discussion, based on evidence -- and many people have treated it as such, and we have appreciated what they've had to say, and the thoughtful, knowledgeable way they've said it.

But then there's the rest.  The internet can be an ugly place.  We have chosen not to contribute to that by posting comments that, let's just say, don't move the discussion forward.  At present, it's impossible to have a wide-ranging discussion of human variation in any scientific way because the subject has been appropriated by ideologues with either a superficial understanding of the science, or a willingness to pick and choose the evidence that suits their purposes.  There are plenty of places to find their rants.  But not here.

Friday, June 13, 2014

The pseudo-science of race claims

For more than a century there have been debates about how many human 'races' there are, that is, categories of people, the assumption being that these are natural categories, or types, of people.  Charles Darwin addressed this question in The Descent of Man (Darwin (1871), vol. 1, p. 226), but he was far from the first:
Man has been studied more carefully than any other organic being, and yet there is the greatest possible diversity amongst capable judges whether he should be classified as a single species or race, or as two (Virey), as three (Jacquinot), as four (Kant), five (Blumenbach), six (Buffon), seven (Hunter), eight (Agassiz), eleven (Pickering), fifteen (Bory St. Vincent), sixteen (Desmoulins), twenty-two (Morton), sixty (Crawford), or as sixty-three, according to Burke.
In numerous recent posts, we have tried to examine the nature of the genetic arguments, showing why people don't in fact fall into such categories and why there are much more scientifically accurate and politically less loaded ways to view human variation.  We have pointed out the obvious fact that 'race' is a concept defined by people, and in that sense is a sociocultural construct.  But we have also pointed out that humans do vary geographically, for evolutionary genomic as well as cultural and historical reasons, and that to deny those elements of difference is equally wrong.  The subtleties and nuances of the facts we actually know are harder to deal with.  Categorizing is easy, and it can be emotionally satisfying ('us' vs 'them').  It's much easier than actual thinking.  But history shows its toll.

The killing parade: equal opportunity

Equal opportunity haters:  Brownshirts,  Crusaders, Kulaks, Jihadis, Southern gentlemen, Kristallnacht, Mao's bourgeois enemies, Palestinian life, Rwanda (as best I can identify using Google images).

Few if any groups of any size or historical duration have avoided the temptation of energized, hate-filled oppression of others.  There is usually a piously expressed reason why the victims are categorically guilty of some offense that makes them inherently unworthy.

In the course of the current resurgence of categorical racism and genetic destiny-worship, expressed as feigned dispassionate discussion about "why can't we revel in human genomic diversity?", what quickly appears is the claim that modern genomic science 'really' shows us the truth (uncomfortable though it may be to you).  We immediately started seeing internet hate traffic saying that those who suggest that environment has something to do with 'racial' traits are [fill-in-the-blank], meaning stupid and ignorant and worse.  Invoking environment as a cause of human variation, those chatterati say, willfully ignores the underlying genomic truth in the name of political correctness.  One quickly sees that this means those guys really are inferior to these guys (usually meaning the speaker's clan), and that the inferiority is inherent to their category -- with special attention usually paid to behaviorally sensitive traits, like intelligence.

We have seen it alleged that those who deny the existence of races as genetic categories will be the cause of many deaths in minority populations, because by denying that there are traits that can be distinguished by asking people to identify their 'race', they will deprive those people of needed genomic health care.  Instead, the opposite is true: people would be better off treated as individuals, each with a unique genomic and cultural context, than as categories within which we know there is considerable variation (a topic we've also recently discussed).  And of course this accusation ignores the facts that 1) it's going to be a long time before genomic medicine is a major part of health care, 2) many of those pushing race-specific medical treatment are gaming the patent system for drugs they sell, and 3) it will be even longer before genomic medicine is available to anyone but the rich, who don't proportionately represent all the 'races' in the population.

The idea that race exists as a genetic but not a social construct is advocated despite such obvious facts as that people with parents of different 'races' are assigned to one or the other parental 'race'.  Even more, the accusation is made that those who object to the genetic classification of races do so because they deny the existence of race altogether.  This is an awful, and often awfully intentional, misperception.  A person's identity involves far more, and far stronger, cultural than genetic factors, and thus 'race' is truly meaningful to many people.  Social standing, opportunities, legal rights, and discrimination are manifest aspects of what most people mean by 'race'.

This aspect of race is clearly a sociocultural fact, a fact of the world worthy of scientific study -- even if it does not imply, and is not caused by, race as a genomic category of nature.  People have been gassed and lynched and denied opportunity or basic rights in large numbers not because of their genomes, but because of their cultural category; you can't get more real than that.  One need not pretend that race categories don't exist, but one can insist that by and large they are culturally based constructs that are only partly, and even inconsistently, determined by physical (or genetic) traits.  The consequences are an undeniable aspect of the factual reality of 'race'; in that sense races, as categories, certainly exist.

We have also seen, in the gurgling internet hate traffic, assertions that some Jewish critics of 'human biodiversity' oppose racial determinism because they are Jewish (and, incidentally, liberals), while the authors of a lot of the hate discussion are not themselves Jewish.  This is rather far off the wall.  The current batch of race authors, though not Jewish, actually claim that Jews are genomically superior to other races when it comes to intelligence (because they were selected for money-lending).   From that point of view, Jews should be eagerly embracing the new racism bestowed upon them by non-Jewish authors!  Indeed, we have several times had Jewish colleagues sidle up, quietly, and ask sotto voce whether we don't think, as geneticists, that though one can't say it out loud, Jews really are genetically smarter. Or whether we'd have to agree, though we couldn't say that out loud either, that there was something genetically wrong with Arabs.

The images we show above are intended to remind readers that the treatment of others as being inferior or unworthy, as a group, which justifies doing the most awful things to them, is rather widespread, with a wide range of ideological rationales, including 'science'.  It's humans ganging up on other humans for a whole panoply of emotionally self-reinforcing reasons.  Since Darwin, it has often been asserted that the inherent value differences of groups are Evolution's will rather than God's, but the commonality is that the judgment is always credited to some external and hence objective force relative to the denigrating speaker, and applies to purported group characteristics.

There is misunderstanding galore about how human genomic variation is distributed around the world, how it got that way, and how an individual's genome and cultural and physical environmental circumstances affect his or her traits.  There is a culpable willingness to ignore the complexities, as well as the human sensitivities, involved in all of this.  Some traits are more clearly 'genetic' and others more 'environmental', a topic we've recently discussed.  But the recent delving into complex traits, especially behavioral ones with societal and emotional content, has gone right off the deep end, past the point at which serious, measured scientific discussion takes place; sadly, otherwise sub rosa racism and policy objectives then quickly float to the surface (justified in the name of objective truth).

The facts of human variation are sometimes uncomfortable; some concern things that can't easily be changed (like skin color or cystic fibrosis), others things that change so fast that the cultural environment is clearly largely responsible.  The facts don't need artificial categories when more accurate, more quantitative analysis can be carried out without doing what is often hateful violence to reality.  But people like the emotional side of hatred.

Thinking about the facts of the world is very difficult, and there are often no easy answers to scientific questions.  It's a whole lot easier to jump on a bandwagon that offers thought-free reinforcement of your own personal value 'category' and whips up tribal animus towards those of a different view.  'They' become objects, things to which you can do whatever you like with impunity, because they are inherently inferior....just like animals.

Hating is more fun, more energizing, and a lot easier than thinking.

Thursday, June 12, 2014

The Big Bu(r)st! Cosmic Microwave Inflation....deflates

If a WorldView commentary by Paul Steinhardt, a Princeton physicist, in the June 5 issue of Nature is more reliable than much of what we find in Nature, the Big Story of the Big Bang was a Big Bust.  Remember the piece in the March 17 issue reporting that "Astronomers have peered back to nearly the dawn of time and found what seems to be the long-sought ‘smoking gun’ for the theory that the Universe underwent a spurt of wrenching, exponential growth called inflation during the first tiny fraction of a second of its existence"? Well, it turns out that this may not be true after all.  Or at least it's still being debated, as another Nature commentary in the same issue describes. The announced result, which Steinhardt writes influenced academic appointments and decisions on paper publications, funding, and the like, was apparently, he says, just a misinterpretation of the data, but forthcoming papers will carry on the discussion.

We personally know as little about cosmological physics as some people think we know about genetics, but even so we venture to say that (as the Commentary author pointed out) the treatment of the original report by the news media, and hence also by the reporting scientists, shows a lot about the (sorry) state of an important aspect of science today:  it is too much about advertising and too little about science.

In essence, if we look in all directions at the cosmic radiation we can detect reaching our region of space, it is very even -- not very clumpy.  In a generic big bang theory, the splatter of matter from the very beginning would eventually be drawn together again by gravity, forming lots of clumps (galaxies and the like) in the process.  But while there are on the order of 100 billion galaxies with about 100 billion stars each, plus lots of other rocks and matter and energy out there, and lots more dark matter and energy, everything is distributed far more uniformly throughout space than one might have expected.

But if, as Alan Guth suggested in the 1980s with his cosmic inflation theory, there was a burst of very rapid expansion just after the Big Bang itself, then this 'inflation' should have generated ripples in space-time.  These should be detectable as irregularities in the cosmic microwave radiation now reaching the earth -- irregularities reported to reflect, as we understand it, gravitational ripples due to that initial inflation nearly 14 billion years ago.

History of the Universe -- gravitational waves are hypothesized to arise from cosmic inflation, a faster-than-light expansion just after the Big Bang (17 March 2014).  Wikipedia

The report of this finding of cosmic microwave ripples lent support to various aspects of cosmological theory.  When it was breathlessly announced, and picked up equally breathlessly by the media, talk of Nobel prizes, of an astounding discovery about the nature of everything, and so on was trumpeted from the media-tops.  Only a little skepticism or reserve was expressed by pundits and other cosmologists, most of whom gushed with praise over this long-awaited report.

The problem is, apparently, that other sources of twisty microwave radiation exist that are unrelated to any early expansion and are more likely to account for the ripples that were reported.  There may be nothing inflationary there after all, or if there is something that survives scrutiny it may only be slightly more likely to be due to inflation than to other things.

Forgiving any errors in our above account, which aren't important to our objective, this is how things stand today. We note yet again the role that the media and hype-machinery are playing in today's science. The more important the claim, the more rigorous and extensive should be the evidence for the claim. In this case, the inflation-finding study was not even published when it was announced and picked up with great melodrama by the media.

Suppose one claimed that, after millennia of searching for more than just faith as evidence, s/he had definitive proof of God's existence.  For example, claiming that God had five fingers on each of two hands (we are, after all, said to be in his likeness and image) was proved because of how He enumerated the ten commandments.  For this claim to be accepted and reported as real, the evidence should be overwhelming, because if true the claim would fundamentally change how many of us view existence itself, and our place in it.  Such a truth would merit the most serious reception, as it would replace the less than satisfying "sure and certain hope" with something a bit more reassuring!  That someone reports some sort of evidence should not be given a splash, or even much if any mention, in public media.  It should be reported in a technical science journal, leading to focused and deliberate consideration, confirmation or elaboration, and so on, before it is accepted and reported as fact that fundamentally changes our worldview.

That might happen had we a more responsibly run, less self-promoting environment.  The inflation-evidence claim would have properly appeared, quietly, in a peer-reviewed technical cosmology journal, where it could be scrutinized and built into more elaborate theoretical and empirical understanding.  It should have been given the kind of measured acceptance that leads to focused attempts to test, refute, or elaborate it.  No big-splash press headlines, no drooling over obvious Nobel prizes, no advertising are merited at this stage.  Indeed, the author of the Nature commentary notes the role of hype in the premature inflation announcement, and argues that, in his view at least, the theory that was supposedly confirmed by the gravitational ripples is untestable even in principle, because it can be made consistent with any kind of data one could collect.  Nonetheless, the cosmology industry will not tuck its tail between its legs and move on, quietly, to more effective, even if less expensive, questions that can actually yield believable answers, withholding the public blare until results are solidly established and really deserve the attention.

Large amounts of funding are manipulated by the current way of doing science.  It is hard not to believe that this is intentional on the part of all parties -- the proverbial mutual reinforcement society.  We know from personal experience that it is this way, quite consciously, in genomics and other biomedical sciences.  Grand claims trump quiet, focused research on problems that might help people's lives but are not cosmic in scale -- and there are many such genetic problems that should be solvable.  As it is now, rather than being careful guardians of the public purse, the funders themselves are part of the hype game, making sure their portfolios stay full (or are subject to inflation!), an attitude that should not be tolerated.  Investigators hungry for funds to support their work sing the song that will get them their supper.  Claims and papers are put out, structured, and timed so they can be cited as justification for further funds.  We all know we are doing this -- 'all' except, perhaps, the public who's paying for it.  But science should avoid this sort of secret hypocrisy.

A million multiverses don't exist, nor does a cure for cancer, just because some hasty press release says so.  Major findings, when really real, deserve all the praise one can give them.  Science -- actual, real results rather than advertising -- is interesting enough in itself without hype, and can be made so to the public without all the candy wrappers it comes in these days.  The world can wait until the evidence is really sufficient.  It isn't exactly urgent that we know whether there was cosmic inflation 14 billion years ago.  Nor do we urgently need to identify every gene contributing a fraction of a percent to diabetes risk (far less than the risk reduction to be had by eating better).

Not even the Republican budget-cutters have yet caught on, but if we're lucky, the public will demand it.  Funding agencies should be threatened with deep cuts if they don't do their jobs properly; that might at least frighten scientists into doing more work and less boasting, even if the work takes time and major results are more elusive.  A more modest way of doing, and reporting, science would be more responsible, and would make our careers more satisfying -- especially for new entrants, who have whole careers to look forward to.  As things are done now, we've built a system in which too many glittering Big Bursts turn into the ashes of Big Busts.