
Wednesday, September 8, 2021

Early Farmers Developed Muted Autoimmunity Responses

Uncontacted hunter-gatherers usually die of disease en masse when encountering outsiders from the modern world. This is because they lack the immune system adaptations that were developed by the first farmers in the various Neolithic revolutions of the world.

An ancient DNA study of the early Neolithic Vinca culture of the Balkans and of other ancient European DNA samples documents those evolutionary adaptations. Specifically, their global (systemic) inflammatory responses were muted, while their localized inflammatory responses were enhanced.

When early farmers of the Vinca culture first sowed barley and wheat 7700 years ago in the rich soil of the Danube River and its tributaries, they changed more than their diet: They introduced a new way of life to the region. They crowded together in mud huts, living cheek by rump with aurochs, cows, pigs, and goats—and their poop—in settlements that eventually swelled to thousands of people. Togetherness brought a surge in diseases such as influenza, tuberculosis, and other maladies spread from animals to people and through early farming communities.

Now a new study of ancient DNA shows how the immune systems of those early farmers responded to this new, pathogen-ridden environment. The Neolithic Revolution was a “turning point” in the evolution of immune responses to infectious disease, according to a paper published today in eLife. The study suggests that in Europeans, evolution favored genes that throttled back inflammatory reactions to pathogens like influenza, restraining the hyperalert inflammatory response that can be deadlier than the pathogen itself.

“This study does a great job of showing that our immune system has continued to evolve in response to pathogen pressure,” says population geneticist Joseph Lachance of the Georgia Institute of Technology. . . . 

Researchers have long suspected that early farmers got sick more often than nomadic hunter-gatherers. Studies suggest farmers in large Neolithic sites such as Çatalhöyük in Turkey faced a flurry of new zoonotic diseases such as influenza and salmonella, as well as new animal-borne strains of diseases like malaria and tuberculosis. “If farmers got sick more, how did their immune systems change?” asked infectious disease specialist Mihai Netea of Radboud University Nijmegen Medical Centre, who led the study.

To approach that question, his team first studied genetically based variation in the immune responses of living people. They took blood samples from more than 500 people in the Human Functional Genomics Project (HFGP), a biobank based in Nijmegen, Netherlands, and challenged the samples with various pathogens. Then they measured levels of specific cytokines—immunoregulatory proteins such as interleukin and interferon that are secreted by immune cells—and looked for correlations between those levels and a suite of immune gene variants.

In the new study, the team used those results to come up with what’s called a polygenic risk score that predicts the strength of the inflammatory response in the face of specific diseases, based on an individual’s immune gene variants. The researchers then applied their technique to the past: From existing databases they downloaded ancient DNA sequences from 827 remains found across Europe, including Vinca farmers from today’s Romania. They calculated the cytokine levels ancient people would likely have produced and their polygenic risk scores for inflammation.
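
The core of that technique is just a weighted sum: for each risk variant, multiply the number of effect alleles an individual carries by that variant's estimated effect on cytokine production, then add up the results. A minimal sketch in Python, with entirely hypothetical variant IDs, weights, and genotypes (the real study derived its weights from cytokine measurements in the HFGP cohort):

# Minimal polygenic score sketch: score = sum over variants of
# (estimated effect size) x (number of effect alleles carried, 0/1/2).
# The variant IDs, weights, and genotype below are hypothetical placeholders.

cytokine_weights = {
    "rs0000001": 0.12,   # hypothetical per-allele effect on cytokine production
    "rs0000002": -0.08,
    "rs0000003": 0.05,
}

def polygenic_score(genotype, weights):
    """Weighted sum of effect-allele dosages (0, 1, or 2 per variant)."""
    return sum(weights[snp] * genotype.get(snp, 0) for snp in weights)

# One (hypothetical) ancient individual's effect-allele dosages.
ancient_individual = {"rs0000001": 2, "rs0000002": 0, "rs0000003": 1}

print(round(polygenic_score(ancient_individual, cytokine_weights), 2))   # 0.29

Averaging such scores across individuals binned by date is, in spirit, how a shift over time like the one described below can be detected.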

The remains dated from between 45,000 and 2000 years ago, enabling the team to look for changes over time. They found that when faced with infections, Europeans who lived after agriculture likely produced dramatically lower levels of systemic cytokines than earlier hunter-gatherers. Those lower levels were likely adaptive, Netea says. “When people first encountered new pathogens, some overreacted and died, like we see with COVID today,” he says. “The children of the people who survived didn’t produce as many cytokines, so the whole population becomes more resistant.”

The study also revealed a flip side: When infected with the fungus Candida and Staphylococcus bacteria—pathogens that tend to start as localized infections—farmers likely mounted more robust inflammatory responses than earlier hunter-gatherers. A strong inflammatory response can quell a localized infection before it spreads, but a robust systemic response, as sparked by the flu or malaria, can spiral out of control.

From Science.org

The paper and its abstract are as follows:

As our ancestors migrated throughout different continents, natural selection increased the presence of alleles advantageous in the new environments. Heritable variations that alter the susceptibility to diseases vary with the historical period, the virulence of the infections, and their geographical spread. In this study we built polygenic scores for heritable traits that influence the genetic adaptation in the production of cytokines and immune-mediated disorders, including infectious, inflammatory, and autoimmune diseases, and applied them to the genomes of several ancient European populations. We observed that the advent of the Neolithic was a turning point for immune-mediated traits in Europeans, favoring those alleles linked with the development of tolerance against intracellular pathogens and promoting inflammatory responses against extracellular microbes. These evolutionary patterns are also associated with an increased presence of traits related to inflammatory and auto-immune diseases.

Jorge Domínguez-Andrés, et al., "Evolution of cytokine production capacity in ancient and modern European populations" eLife 10:e64971 (September 7, 2021). DOI: 10.7554/eLife.64971


Monday, July 26, 2021

The Common Cold Is Old

The common cold virus is much older than modern humans. 

The origins of viral pathogens and the age of their association with humans remains largely elusive. To date, there is no direct evidence about the diversity of viral infections in early modern humans pre-dating the Holocene. We recovered two near-complete genomes (5.2X and 0.7X) of human adenovirus C (HAdV-C), as well as low-coverage genomes from four distinct species of human herpesvirus obtained from two 31,630-year-old milk teeth excavated at Yana, in northeastern Siberia. 
Phylogenetic analysis of the two HAdV-C genomes suggests an evolutionary origin around 700,000 years ago consistent with a common evolutionary history with hominin hosts. 
Our findings push back the earliest direct molecular evidence for human viral infections by ∼25,000 years, and demonstrate that viral species causing common childhood viral infections today have been in circulation in humans at least since the Pleistocene.
From Sofie Holtsmark Nielsen, et al., "31,600-year-old human virus genomes support a Pleistocene origin for common childhood infections" bioRxiv (June 28, 2021).

Thursday, May 20, 2021

The Domestication Of The Opium Poppy

The opium poppy was not part of the Fertile Crescent Neolithic package of domesticated crops, but was added soon afterwards. 

[T]he opium poppy (Papaver somniferum L.) was domesticated in the western Mediterranean, where its presumed progenitor Papaver somniferum subsp. setigerum (DC.) Arcang is native and still grows wild today.

Using a new method of analysis, researchers from the universities of Basel and Montpellier have now been able to strengthen the hypothesis that prehistoric farmers living in pile dwellings around the Alps began to cultivate and use the opium poppy on a large scale from about 5500 BCE. By doing so, they contributed to its domestication, as the team reports in the journal Scientific Reports.

From this press release about Ana Jesus, et al., "A morphometric approach to track opium poppy domestication" 11(1) Scientific Reports (2021). DOI: 10.1038/s41598-021-88964-4.

Monday, May 6, 2019

Pre-Columbian Bolivians Used Psychoactive Drugs

Psychoactive drugs weren't unique to the Old World. They were used by shamans in the highlands of the Andes Mountains around 1000 CE.
In a Bolivian rock shelter likely used 1,000 years ago for religious rituals, archaeologists found a collection of drug paraphernalia that still contains traces of psychoactive plants. A pouch made from three fox snouts likely contained a stash of leaves and seeds. From New Scientist
(The items) include a 28-centimetre-long leather bag, a pair of wooden snuffing tablets, a snuffing tube, a pair of llama-bone spatulas, a textile headband, fragments of dried plant stems and a pouch made from three fox snouts stitched together. The snuffing tube and tablets feature ornate carvings of human-like figures. 
Melanie Miller at the University of Otago, New Zealand, and her colleagues used mass spectrometry to analyse samples from the pouch and plant stems. They detected five psychoactive compounds: cocaine, benzoylecgonine (BZE), bufotenine, harmine and dimethyltryptamine (DMT).
From here.


A picture of the pouch described above.

The source is: "Chemical evidence for the use of multiple psychotropic plants in a 1,000-year-old ritual bundle from South America" (PNAS).

DMT (the active ingredient in Ayahuasca) is a psychedelic drug in the same effect category as LSD and "magic mushrooms." It is often used in connection with harmine.

In large doses, harmine can cause agitation, excessively low or high heart rates, blurred vision, hallucinations, dangerously low blood pressure, and a tingling, pricking, chilling, burning, or numb sensation on the skin with no apparent physical cause. In low doses it is a cancer drug and an anti-depressant of the MAOI variety, and it promotes bone growth.

Bufotenine is chemically a close relative of DMT found in toad secretions (it can be lethal in high doses) and in certain South American seeds.

Cocaine is widely used illegally as a general nervous system stimulant and is considered highly addictive at the doses common in that kind of use. Benzoylecgonine (BZE) is a metabolite of cocaine.

Given the mix of compounds that were found together, and their effects, the hypothesis that this was the drug stash of a shaman is very plausible. This fits attested examples of how these compounds have been used by indigenous South American people.

Thursday, May 31, 2018

Vitamin D Deficiencies Linked To Miscarriage And Reduced Fertility

One of the big mysteries of the Bronze Age is why lactase persistence and light skin color were so strongly selected for in Europe. Both traits reduce the risk of Vitamin D deficiency. But, why would Vitamin D deficiency have such a strong selective effect?

One reason is that Vitamin D deficiency is linked to miscarriage and to reduced fertility.
Women who had sufficient preconception vitamin D concentrations were 10 percent more likely to become pregnant and 15 percent more likely to have a live birth, compared to those with insufficient concentrations of the vitamin. Among women who became pregnant, each 10 nanogram per milliliter increase in preconception vitamin D was associated with a 12-percent lower risk of pregnancy loss.
A typical woman can have about eight kids plus a normal number of miscarriages in a lifetime. This would have been common for Bronze Age farmers (herders spaced their children a bit further apart than farmers did, making each lost child even more devastating). Also, Bronze Age women who didn't have enough Vitamin D probably had far less Vitamin D in their systems than modern women with Vitamin D deficiencies do.

When you are aiming for two or three children, as most modern European families do, an extra couple of miscarriages will cause grief but won't affect your lifetime contribution to the gene pool very much. But, if everyone needs to have eight children just to have two of them live to reach adulthood and have children of their own, extra miscarriages, together with increased difficulty getting pregnant, can dramatically reduce your reproductive fitness.

Even if miscarriage and reduced fertility were the primary problems caused by reduced Vitamin D, a secondary issue may have been immune system health, which Vitamin D enhances, in a world with no vaccines or antibiotics to address infectious diseases, and where Vitamin C, another immune system enhancer, would have been almost entirely absent for much of the year in Northern Europe.

At any rate, this data point should be useful for quantifying the selective fitness impact of LP and light skin color genes in Northern Europe during the Bronze Age.
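
As a crude illustration of how large that impact could be, one can treat the quoted 15% difference in live-birth rates as if it carried over directly to lifetime reproductive output. That assumption is certainly too strong (the modern study measured outcomes in a clinical cohort, not lifetime fertility, and ancient deficiency levels would have differed), but it shows the order of magnitude of selection that numbers like these can imply:

# Illustrative arithmetic only: treat the quoted live-birth relative risk as a
# per-generation fitness ratio, assuming lifetime reproductive output scales
# with it.  This is a deliberately oversimplified, hypothetical calculation.

live_birth_relative_risk = 1.15            # vitamin-D-sufficient vs. insufficient (quoted above)

relative_fitness_deficient = 1 / live_birth_relative_risk
s = 1 - relative_fitness_deficient         # selection against the deficient phenotype

print(f"relative fitness of the deficient phenotype ~ {relative_fitness_deficient:.2f}")   # ~0.87
print(f"implied selection coefficient s ~ {s:.2f}")                                        # ~0.13

A coefficient anywhere near that size would be among the strongest estimated in humans, which is the point: even a fraction of it could plausibly drive strong sweeps at LP and pigmentation loci.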

Saturday, September 9, 2017

The Voynich Manuscript Deciphered

Nicholas Gibbs convincingly argues in the Times Literary Supplement that he has deciphered the 15th century illustrated manuscript known as the Voynich manuscript. He argues that it is a copied anthology of medical texts, focused on women's health, that traces for the most part to classical period sources, and that the text mostly consists of abbreviations of Latin words found in the source texts.

The manuscript had eluded previous efforts to decode it for many decades, if not centuries.

Sunday, March 26, 2017

There Was Intense Selection For Vivax Malaria Protective Genes In Africa 40kya

The Secret History Of Mankind's Struggle With Malaria In Africa

Vivax malaria, the subject of the article below, is the second most common variety of malaria (and the most common type outside of Africa). According to Wikipedia, which first discusses malaria generally and then the two most common types of it:
The disease is widespread in the tropical and subtropical regions that exist in a broad band around the equator. This includes much of Sub-Saharan Africa, Asia, and Latin America. In 2015, there were 214 million cases of malaria worldwide resulting in an estimated 438,000 deaths, 90% of which occurred in Africa. . . .  
Although P. falciparum traditionally accounts for the majority of deaths, recent evidence suggests that P. vivax malaria is associated with potentially life-threatening conditions about as often as with a diagnosis of P. falciparum infection.
But, ca. 40,000 years ago in parts of Africa (not those less tropical areas where the Khoi-san bushmen lived), a genetic mutation protective against vivax malaria "conferred a selective advantage of about 4.3%, leading to effective fixation in about 8,000 years." 
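
As a rough sanity check on that quoted figure, a deterministic single-locus selection model (ignoring drift, dominance, and demography, all of which the actual paper handles more carefully) can be iterated from the reported starting frequency of about 0.1% with s = 0.043. A sketch in Python, with an assumed generation time of 29 years:

# Deterministic sweep under genic selection: each generation the favored
# allele's odds p/(1-p) are multiplied by (1 + s).  This ignores drift,
# dominance, and demography, so it is only a back-of-the-envelope check.

p, s = 0.001, 0.043            # initial frequency ~0.1%, selection coefficient 4.3%
years_per_generation = 29      # assumed generation time

generations = 0
while p < 0.99:                # treat 99% as "effective fixation"
    p = p * (1 + s) / (1 + p * s)
    generations += 1

print(generations, generations * years_per_generation)
# ~274 generations, i.e. roughly 7,900 years -- consistent with
# "effective fixation in about 8,000 years"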

A parallel story mostly involving P. falciparum is told in the genetics of similarly intense selective pressure on genes including sickle cell trait, thalassaemia traits, and glucose-6-phosphate dehydrogenase deficiency. The combined lethality of all kinds of malaria was thus almost certainly considerably greater than indicated by the selective advantage against one kind of malaria conferred by this gene (which related to what are called "Duffy antigens") alone.

Duffy antigen genes also have multiple applications in population genetics.

A Hint About Intra-African Human History?

Anatomically modern humans have been present in Africa for ca. 150,000 to 250,000 years, and archaic hominins were present in Africa for millions of years. Moreover, it is unlikely that vivax malaria was limited to modern humans. Many forms of malaria can also affect great apes besides humans, and archaic hominins would have been very similar to modern humans in terms of the traits that would have made them vulnerable to, or resistant to, malaria.

So, the most notable aspect of the protective Duffy antigen genes in humans is not that they are present, but that they arose so recently and only after the founding population of non-Africans left the continent.

In particular, the TMRCA date for the Duffy antigen gene that has reached fixation in tropical Africans isn't that far from the 60,000 years before present TMRCA date for African Pygmies, an African population with one of the most basal divisions from other modern humans in Africa (together with the Khoisan people, who are somewhat more basal), which lives mostly in the rainforests of the vast Congo River basin of tropical Africa.

So, while it could be that modest population sizes meant that it simply took a long time for a protective Duffy antigen mutation to occur, it is also quite plausible that the timing is an indication that modern humans in Africa did not live in tropical areas heavily afflicted with malaria mosquitoes until ca. 40,000 years ago (plus however long it took for a mutation to emerge once they started to live there).

This, combined with knowledge about which parts of Africa were historically deserts or rain forests, could shed a lot of light on the question of the historical range of modern humans and their archaic hominin ancestors in the time frame from their earliest evolution to ca. 40,000 years ago, ruling out much of West Africa and Central Africa from that range.

The Article

As explained by a new article revealing this fact:
The human DARC (Duffy antigen receptor for chemokines) gene encodes a membrane-bound chemokine receptor crucial for the infection of red blood cells by Plasmodium vivax, a major causative agent of malaria. Of the three major allelic classes segregating in human populations, the FY*O allele has been shown to protect against P. vivax infection and is at near fixation in sub-Saharan Africa, while FY*B and FY*A are common in Europe and Asia, respectively. Due to the combination of strong geographic differentiation and association with malaria resistance, DARC is considered a canonical example of positive selection in humans. 
Despite this, details of the timing and mode of selection at DARC remain poorly understood. Here, we use sequencing data from over 1,000 individuals in twenty-one human populations, as well as ancient human genomes, to perform a fine-scale investigation of the evolutionary history of DARC. 
We estimate the time to most recent common ancestor (TMRCA) of the most common FY*O haplotype to be 42 kya (95% CI: 34–49 kya). We infer the FY*O null mutation swept to fixation in Africa from standing variation with very low initial frequency (0.1%) and a selection coefficient of 0.043 (95% CI:0.011–0.18), which is among the strongest estimated in the human genome. We estimate the TMRCA of the FY*A mutation in non-Africans to be 57 kya (95% CI: 48–65 kya) and infer that, prior to the sweep of FY*O, all three alleles were segregating in Africa, as highly diverged populations from Asia and ≠Khomani San hunter-gatherers share the same FY*A haplotypes. We test multiple models of admixture that may account for this observation and reject recent Asian or European admixture as the cause. 
Infectious diseases have undoubtedly played an important role in ancient and modern human history. Yet, there are relatively few regions of the genome involved in resistance to pathogens that show a strong selection signal in current genome-wide searches for this kind of signal. We revisit the evolutionary history of a gene associated with resistance to the most common malaria-causing parasite, Plasmodium vivax, and show that it is one of regions of the human genome that has been under strongest selective pressure in our evolutionary history (selection coefficient: 4.3%). Our results are consistent with a complex evolutionary history of the locus involving selection on a mutation that was at a very low frequency in the ancestral African population (standing variation) and subsequent differentiation between European, Asian and African populations.
Kimberly F. McManus, et al., "Population genetic analysis of the DARC locus (Duffy) reveals adaptation from standing variation associated with malaria resistance in humans" PLOS Genetics (March 10, 2017).

Lethality

This suggests that vivax malaria was very lethal (or at least reproduction preventing) in tropical Africa at the time, killing 4.3% of the entire unprotected population each generation (and a higher percentage of unprotected people who were infected, as not every single person in each generation would have been infected).

About 6% of the population in malaria-vulnerable areas of the world is infected with malaria each year. It is predominantly lethal in children aged five and younger, who account for 70% of deaths from malaria. Infection rates are much higher in tropical Africa, where about 90% of malaria deaths occur.

Infection rates were much higher, about 15%-30% each year and sometimes more, in Africa as recently as the early 20th century; there was a dramatic drop in malaria infection from 11.4% to 0.4% from 1940 to 1942. It is this pervasive infection rate that makes it particularly deadly. "A 2002 report stated that malaria kills 2.7 million people each year, more than 75 percent of them African children under the age of five."

Thus, vivax malaria would have been more lethal than

* the Spanish flu of 1918
* untreated whooping cough
* measles in modern developing countries 
* Lassa fever
* mumps
* treated Dengue fever
* treated tularemia
* diphtheria
* botulism
* perhaps even untreated typhoid fever or SARS (if the infection rate wasn't that high).

Of course, this lethality estimate assumes that the mutation is 100% protective, which it probably isn't. So, due to its probable less than 100% infection rate and its probable less than 100% protectiveness, the actual lethality of vivax malaria in this time period was probably significantly greater than 4.3% per generation of unprotected people.

Modern malaria (all kinds) kills only about 1 in 300 people infected with it; this ancient strain, over the roughly 8,000 years of the sweep, would have been more than 14 times as lethal as malaria is today.

It is also possible that the selective advantage may have had a significant fertility component as opposed to purely a lethality effect since in modern populations: "Malaria in pregnant women is an important cause of stillbirths, infant mortality, abortion [i.e. miscarriage] and low birth weight, particularly in P. falciparum infection, but also with P. vivax." If malaria infection rates were very high (as seems likely), these fertility effects could have given rise to a large share of the selective fitness benefit even if the lethality of an infection was only modest.

New World History - Disease As A Factor In The Slave Trade

This evolutionary history shaped the early days of the Americas by contributing to the history of slavery there. As a result of genetic adaptations to tropical diseases including vivax malaria, the mortality of Africans working on plantations in the American South, the Caribbean, and South America was lower than that of Europeans.

This lower mortality rate, in turn, is one of the factors that caused these parts of the Americas to develop an agricultural economy based upon African slave labor, rather than upon European colonists as the Northern United States, Canada, and some other parts of South America did.

Basque Genetics

Duffy antigen genes are one area among many in which Basque population genetics are distinctive (although some assumptions of this 2005 article are now outdated).
The Basques live at the western end of the Pyrenees along the Atlantic Ocean and are thought to represent the descendants of a pre-Neolithic people. They demonstrate marked specificities regarding language and genetics among the European populations. We review the published data on the population genetics and Mendelian disorders of the Basques. 
An atypical distribution in some blood group polymorphisms (ABO, Rhesus, and Duffy) was first found in this population. Subsequently, additional characteristics have been described with regard to proteins (enzymes and immunoglobulins) and the HLA system. The advent of molecular biology methods in the 1990s allowed further insights into Basque population genetics based mainly on Y-chromosome and mitochondrial DNA. In addition, the Basques demonstrate peculiarities regarding the distribution of various inherited diseases (i.e., unusual frequencies or founding effects). Taken together, these data support the idea of an ancient and still relatively unmixed population subjected to genetic drift.
Frederic Bauduer, J. Feingold, and Didier Lacombe, "The Basques: Review of Population Genetics and Mendelian Disorders" 77(5) Human Biology 619-637 (October 2005) (closed access).

Monday, February 6, 2017

Fat Tails Are The Norm

Scientific papers of all stripes routinely assume that the likelihood of low frequency events is "Gaussian," which is to say that the probability distribution is approximately a "normal" distribution that can be parameterized fully by a mean value and a standard deviation.

But, a recent study shows that, in real life, in complex fact patterns, probability distributions routinely have "fat tails" (a.k.a. "long tails") (i.e. the likelihood of extreme events is much greater than a normal distribution would suggest).
Published recently in the journal Royal Society Open Science, the study suggests that research in some of the more complex scientific disciplines, such as medicine or particle physics, often doesn't eliminate uncertainties to the extent we might expect. 
"This is due to a tendency to under-estimate the chance of significant abnormalities in results." said study author David Bailey, a professor in U of T's Department of Physics. 
Looking at 41,000 measurements of 3,200 quantities -- from the mass of an electron to the carbon dating of a sample -- Bailey found that anomalous observations happened up to 100,000 times more often than expected. 
"The chance of large differences does not fall off exponentially as you'd expect in a normal bell curve," said Bailey.
The paper and its abstract are as follows:
Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student’s t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
David C. Bailey, "Not Normal: the uncertainties of scientific measurements" 4(1) Royal Society Open Science 160600 (2017).

It isn't that the statistical law sometimes called the "law of averages" (more formally, the central limit theorem), which makes a Gaussian distribution seem reasonable, is inaccurate. But, the assumptions of that law often do not hold as tightly as our intuition suggests.

This could be because the events are not independent of each other, because systemic error is underestimated, because the measurements aren't properly weighted, because the thing being measured does not have sufficiently quantitatively comparable units, because look elsewhere effects aren't properly considered, or because the underlying distributions of individual events that add up to form the overall result are not simple binomial probabilities.

In particle physics this is handled by setting standards for nominal significance in error estimates assuming a Gaussian distribution that are far higher than what ought to be necessary to constitute a discovery (i.e. 5 sigma).
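
The size of the discrepancy is easy to illustrate: compare the probability of a deviation of at least five reported-uncertainty units under a Gaussian with the same probability under the heavy-tailed Student's t distributions the paper found to fit measurement disagreements (approaching a Cauchy at low degrees of freedom). A short sketch using scipy, as an illustration of the general point rather than a reproduction of Bailey's analysis:

from scipy import stats

# Two-sided probability of a deviation of at least 5 uncertainty units.
gaussian_tail = 2 * stats.norm.sf(5)        # ~5.7e-07
student_tail  = 2 * stats.t.sf(5, df=2)     # Student's t with 2 degrees of freedom: ~3.8e-02
cauchy_tail   = 2 * stats.cauchy.sf(5)      # ~1.3e-01

print(f"Gaussian 5-sigma tail:      {gaussian_tail:.1e}")
print(f"Student's t (df=2) tail:    {student_tail:.1e}")
print(f"Cauchy tail:                {cauchy_tail:.1e}")
print(f"t / Gaussian frequency:     {student_tail / gaussian_tail:,.0f}x")   # ~66,000x, i.e. nearly five orders of magnitude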

Lubos Motl also has a recent post on a similar subject, which I won't attempt to summarize here. It distinguishes between probabilities with "fat tails" (when extreme events are actually more likely than in a normal distribution) and the application of the "precautionary principle" (which is used to justify assuming that unlikely bad events have relatively high probabilities, and regulating them, when the exact probability can't be determined) in the cost-benefit analysis of regulations.

Tuesday, December 6, 2016

Malaria Was Common In The Roman Empire

[R]esearchers have discovered genomic evidence of malaria in 2,000-year-old human remains from the Roman Empire, according to a new study.
With DNA fragments from the teeth of 58 adults and 10 children buried in three imperial-period Italian cemeteries, researchers were able to recover the mitochondrial genome to identify the specific malaria species that infected people. 
Their data confirm that it was the malaria parasite Plasmodium falciparum, the same one that is spread by mosquitoes today and kills hundreds of thousands of people every year. Symptoms include fever, chills and flu-like illness. 
"Malaria was likely a significant historical pathogen that caused widespread death in ancient Rome," said evolutionary geneticist and study author Hendrik Poinar, director of the Ancient DNA Center at McMaster University in Hamilton, Ontario. 
The researchers estimate that malaria killed as many people during the Roman Empire as it does now in Africa. In 2015, there were an estimated 438,000 malaria deaths worldwide, with 91% of them occurring in sub-Saharan Africa, according to the World Health Organization.
From here.

The headline of the media report above asks if malaria caused the fall of the Roman Empire, which is pretty much definitively a "no" because the samples taken are from three to four centuries before the empire fell.

It is notable that there is only a very thin historical record of this massive public health problem despite the depth of contemporaneous records we have from this area in this time period. The paper and abstract are as follows:
The historical record attests to the devastation malaria exacted on ancient civilizations, particularly the Roman Empire. However, evidence for the presence of malaria during the Imperial period in Italy (1st–5th century CE) is based on indirect sources, such as historical, epigraphic, or skeletal evidence. Although these sources are crucial for revealing the context of this disease, they cannot establish the causative species of Plasmodium. Importantly, definitive evidence for the presence of malaria is now possible through the implementation of ancient DNA technology. As malaria is presumed to have been at its zenith during the Imperial period, we selected first or second molars from 58 adults from three cemeteries from this time: Isola Sacra (associated with Portus Romae, 1st–3rd century CE), Velia (1st–2nd century CE), and Vagnari (1st–4th century CE). We performed hybridization capture using baits designed from the mitochondrial (mtDNA) genomes of Plasmodium spp. on a prioritized subset of 11 adults (informed by metagenomic sequencing). The mtDNA sequences generated provided compelling phylogenetic evidence for the presence of P. falciparum in two individuals. This is the first genomic data directly implicating P. falciparum in Imperial period southern Italy in adults.
Stephanie Marciniak, et al., "Plasmodium falciparum malaria in 1st–2nd century CE southern Italy" 26(23) Current Biology 1220-1222 (December 5, 2016).

Monday, October 17, 2016

HPV Strain 16A Was Acquired Via Sex With Archaic Hominins

Every human suffers through life a number of papillomaviruses (PVs) infections, most of them asymptomatic. A notable exception are persistent infections by Human Papillomavirus 16 (HPV16), the most oncogenic infectious agent for humans and responsible for most infection-driven anogenital cancers. Oncogenic potential is not homogeneous among HPV16 lineages, and genetic variation within HPV16 exhibits some geographic structure. However, an in-depth analysis of the HPV16 evolutionary history is still wanting. 
We have analysed extant HPV16 diversity and compared the evolutionary and phylogeographical patterns of humans and of HPV16. We show that codivergence with modern humans explains at most 30% of the present viral geographical distribution. The most explanatory scenario suggests that ancestral HPV16 already infected ancestral human populations, and that viral lineages co-diverged with the hosts in parallel with the split between archaic Neanderthal-Denisovans and ancestral modern human populations, generating the ancestral HPV16A and HPV16BCD viral lineages, respectively. 
We propose that after out-of-Africa migration of modern human ancestors, sexual transmission between human populations introduced HPV16A into modern human ancestor populations. We hypothesise that differential coevolution of HPV16 lineages with different but closely related ancestral human populations and subsequent host-switch events in parallel with introgression of archaic alleles into the genomes of modern human ancestors may be largely responsible for the present-day differential prevalence and association with cancers for HPV16 variants.
Ville N. Pimenoff, Cristina Mendes de Oliveira, Ignacio G. Bravo. Transmission Between Archaic and Modern Human Ancestors During the Evolution of the Oncogenic Human Papillomavirus 16. Molecular Biology and Evolution (2016).

This scenario finally explains unsolved questions: why human diversity is largest in Africa, while HPV16 diversity is largest in East Asia, and why the HPV16A variant is virtually absent in Sub-Saharan Africa while it is by far the most common one in the rest of the world. . . .  Since HPVs do not infect bones, current Neanderthal and Denisovan genomes do not contain HPVs. As a next step, the authors hope to trace HPV sequences in ancient human skin remains as a more direct test of their hypothesis.
Analysis

While diseases originating in other species are hardly new, a solid link between archaic admixture and a specific sexually transmitted disease that modern humans received from archaic hominins is unprecedented.

It is also in accord with existing archaic DNA evidence showing some level of admixture between Neanderthals and Denisovans, which would have allowed whichever of the species harbored HPV16A (if they did not share it from a common ancestor) to introduce it into the other species.

Also, given that archaic admixture took place ca. 50,000 to 75,000 years ago, while HPV16A is still killing tens of thousands of people, if not more, each year, it demonstrates that immune response and natural selection are not all-powerful, particularly in the case of relatively low lethality infections that often strike after someone has already reproduced.

This development also rekindles curiosity regarding disease exchange following first contact with archaic hominins in general. Did modern human diseases ravage archaic hominins in the way that diseases introduced by European first contact ravaged Native Americans? Did the reverse happen, or were both species seriously impacted by the new diseases that they respectively encountered?

Usually, we assume that modern humans either outcompeted or killed off archaic hominins, or that climate and the like had already established an archaic hominin bottleneck, but new diseases could have similar effects, in most cases through intraspecies transmission of the new diseases even before first contact.

UPDATE October 19, 2016: Vox covers the story with more panache: "Neanderthals may have given us their genital warts. Gee, thanks. To be fair, we may have given them diseases that ultimately led to their extinction." by Brian Resnick.

UPDATE (October 31, 2016): John Hawks has an interesting follow up observation:
There is, by the way, the interesting question of whether Neandertal immune variants might influence susceptibility to the strain in question, which has made little inroad into sub-Saharan Africa.

Monday, October 5, 2015

Parasites

The 2015 Nobel Prize for Medicine goes to researchers who found novel treatments for diseases caused by parasites. The map in the linked article, showing where parasitic diseases are a concern, is not very different from maps of some important geographic regions in the study of prehistory, suggesting that parasitic diseases could be an important big picture factor in shaping prehistory, especially after accounting for shifting climates over time.

Monday, February 18, 2013

Genes, Mothers And Lovers

The reality that mothers never completely leave behind ties to the fathers of their children is not just a social reality. It is a biological one.
Microchimerism is the persistent presence of a few genetically distinct cells in an organism. This was first noticed in humans many years ago when cells containing the male “Y” chromosome were found circulating in the blood of women after pregnancy. Since these cells are genetically male, they could not have been the women’s own, but most likely came from their babies during gestation.

In this new study, scientists observed that microchimeric cells are not only found circulating in the blood, they are also embedded in the brain. They examined the brains of deceased women for the presence of cells containing the male “Y” chromosome. They found such cells in more than 60 percent of the brains and in multiple brain regions. . .

Thursday, April 26, 2012

Genetic Diabetes Risk Greater For Africans Than Asians


The Global Distribution of Type Two Diabetes Risk Alleles (from Chen, et al. (2012))

Type Two Diabetes And Genotypes

Diabetes is a disease characterized by the inability of the body to use insulin to properly manage blood glucose (i.e. blood sugar) levels, primarily associated with pancreas function, although some new chemical pathways that play a part in the condition have been discovered. It can be managed with insulin shots and careful restriction of sugar in one's diet (or of foods that quickly metabolize to sugar), and in worst case scenarios, when not managed well, can lead to diabetic shock and comas, to poor circulation leading to loss of limb function or even limbs themselves, and to kidney failure that must be treated with dialysis (i.e. having a machine mechanically treat your blood in the way that internal organs should, typically for many hours several times a week). Mismanaged diabetes is deadly.

There are two main kinds of diabetes. Type one diabetes is associated with poor insulin production, most often manifests in early childhood, and is essentially treatable but incurable, although scientists keep trying. Type two diabetes is the adult onset form that is strongly associated with obesity and other particular dietary imbalances. Diabetes is also sometimes a complication of pregnancy. A number of common genetic variants have been associated with diabetes risk.

A new open access paper at PLoS Genetics (Chen R, Corona E, Sikora M, Dudley JT, Morgan AA, et al. (2012) Type 2 Diabetes Risk Alleles Demonstrate Extreme Directional Differentiation among Human Populations, Compared to Other Diseases. PLoS Genet 8(4): e1002621. doi:10.1371/journal.pgen.1002621) shows that essentially every known gene associated with risk for Type Two Diabetes is more common in Africans than in Asians. As the abstract explains:
Many disease-susceptible SNPs exhibit significant disparity in ancestral and derived allele frequencies across worldwide populations. While previous studies have examined population differentiation of alleles at specific SNPs, global ethnic patterns of ensembles of disease risk alleles across human diseases are unexamined.  
To examine these patterns, we manually curated ethnic disease association data from 5,065 papers on human genetic studies representing 1,495 diseases, recording the precise risk alleles and their measured population frequencies and estimated effect sizes. We systematically compared the population frequencies of cross-ethnic risk alleles for each disease across 1,397 individuals from 11 HapMap populations, 1,064 individuals from 53 HGDP populations, and 49 individuals with whole-genome sequences from 10 populations.  
Type 2 diabetes (T2D) demonstrated extreme directional differentiation of risk allele frequencies across human populations, compared with null distributions of European-frequency matched control genomic alleles and risk alleles for other diseases. Most T2D risk alleles share a consistent pattern of decreasing frequencies along human migration into East Asia. Furthermore, we show that these patterns contribute to disparities in predicted genetic risk across 1,397 HapMap individuals, T2D genetic risk being consistently higher for individuals in the African populations and lower in the Asian populations, irrespective of the ethnicity considered in the initial discovery of risk alleles.  
We observed a similar pattern in the distribution of T2D Genetic Risk Scores, which are associated with an increased risk of developing diabetes in the Diabetes Prevention Program cohort, for the same individuals. This disparity may be attributable to the promotion of energy storage and usage appropriate to environments and inconsistent energy intake. Our results indicate that the differential frequencies of T2D risk alleles may contribute to the observed disparity in T2D incidence rates across ethnic populations.
A New Paradigm For Population Level Epidemiology

In most studies of disease risk, disease incidence is known, and a potential ancestry based risk, which could be due to either nature or nurture, is inferred from disease incidence data after controlling for known environmental risk factors (e.g. diet), but genotype is not directly measured. This paper is a direct study of known genotypes.

It is beyond reasonable dispute that Africans (and people with African descent) have greater frequencies of genes believed to present a risk for type two diabetes, while East Asians, and people of East Asian descent have a lower frequency of these genes.
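
The paper's notion of "directional differentiation" is straightforward to state computationally: across a set of risk alleles, ask how consistently the risk-allele frequency is higher in one population than in another, when under the null each allele is an independent coin flip. A hedged sketch with made-up frequencies (the real analysis used HapMap and HGDP panels and frequency-matched control SNPs):

from scipy import stats

# Hypothetical risk-allele frequencies for a handful of T2D-associated SNPs.
# Real values come from HapMap/HGDP panels; these numbers are placeholders.
freq_africa    = [0.72, 0.65, 0.80, 0.58, 0.69, 0.74, 0.61, 0.77]
freq_east_asia = [0.41, 0.30, 0.55, 0.33, 0.48, 0.52, 0.29, 0.60]

# Count SNPs where the African frequency exceeds the East Asian frequency.
higher_in_africa = sum(a > e for a, e in zip(freq_africa, freq_east_asia))
n = len(freq_africa)

# Two-sided sign (binomial) test against the null of no consistent direction.
p_value = stats.binomtest(higher_in_africa, n, p=0.5).pvalue

print(f"{higher_in_africa}/{n} risk alleles more common in the African sample")
print(f"sign-test p-value: {p_value:.3f}")   # 8 out of 8 gives p ~ 0.008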

It is possible that there are unknown confounding genes or dietary practices or cultural practices that prevent these genes from manifesting in a Type Two Diabetes phenotype, or that alternatively make low risk individuals more prone to developing Type Two Diabetes than the known Type Two Diabetes risk genes would suggest.

An analogous case involving an unknown confounding gene occurs for lactase persistence. A significant number of Africans who are lactase persistent (i.e. continue to have no problem drinking cow's milk as adults), a trait closely associated with specific known genes in Europeans, lack the main European LP genes. This is because there are other African variants of the LP gene (some not specifically identified) that serve the same purpose. This scenario is plausible because African gene-disease associations are less well studied, in general, than European and Asian gene-disease associations.

An analogous case of a dietary practice confound is the case of cardiovascular disease in Parisians. The residents of Paris, France have diets that are high in fats known to be important risk factors for cardiovascular disease. But, they don't have an incidence of those diseases that reflects that level of fat consumption. The main reason for the disparity appears to be that Parisians also drink lots of red wine and that some combination of the alcohol and the other contents of the wine counteracts the dangers of a high fat diet.

Wider Implications Related To Diabetes

But, the null hypothesis would be that a significant share of the known racial variation in type two diabetes incidence flows from the rates at which type two diabetes risk alleles are present. And, since the impact of these type two diabetes risk alleles has been quantified in most cases, it should be possible to statistically distinguish the racial variation in type two diabetes incidence due to known genetic risks from the variation due to environmental factors and undiscovered hereditary factors: the reverse of the usual epidemiological paradigm.

A study conducted with the methodology used in this paper would likely find, for example, that a significant share of the type two diabetes incidence among African-Americans in the American South, which has previously been attributed to poor choices in diet and exercise by old paradigm epidemiological studies conducted without genotype information, may actually be due to differences in genotypes between African-Americans and whites.

Of course, while the proportion of the type two diabetes incidence rate attributable to heredity rather than diet and exercise may shift, the practical response is much the same.

In essence, a person with type two diabetes risk genes is someone who can't escape the disease consequences of suboptimal diet and exercise choices to the same extent as someone who lacks those genes. For example, a full blooded Korean American woman with a sedentary office worker lifestyle and a moderately high sugar, fat, and calorie diet is probably quite a bit less likely to get Type Two Diabetes than an African American woman with a typical level of African and non-African admixture whose activity and diet are precisely the same. Proof, once again, that life is not in the least bit fair.

Put another way, on average, people of African descent need to pay more attention to lifestyle risk factors for type two diabetes and obesity to avoid ill health effects than the average person does.

Implications For Racial Disparity In Other Disease Risk Genotypes

On the other hand, with a few other minor exceptions (sickle cell anemia vulnerability, for example, which is more common in Africans because the same gene that causes the disease also provides resistance to malaria), a not very clearly stated implication of this study (given that it looked at genotype studies of 1,495 diseases and that Type Two Diabetes stood out as the most noteworthy of them) is that almost no other common diseases with known common SNP allele genetic risk factors have such a starkly ancestry-linked pattern of genotypic risk. Type Two Diabetes appears to be something of a worst case scenario.

The default assumption when looking for geographic patterns in other diseases, at least those not associated with diet and metabolism and not infectious diseases of a geographically constricted range, should be that genotype is much more loosely linked to geography, race or deep ancestry.

Also, since the source populations of particular areas of Europe or Asia often have much smaller effective populations and have experienced serial founder effects, even very large populations in these areas are likely to have quite homogeneous patterns of disease vulnerability risk, so epidemiological models that focus on environmental factors may be more viable in these situations than in populations in or near Africa, or in multi-continental melting pot populations in the New World.

Implications For Modern Human Evolution


Which adaptations conferred the greatest fitness advantages?

If diet, metabolism and infectious disease risk are the primary distinctions in disease risk genotypes between Africans and non-Africans (and one can obviously add skin, eye and hair coloration and type to the list), this implies that infectious disease and food supply have been some of the most powerful factors in evolutionary selection on modern humans in the post-Out of Africa era, while many other hypothetically plausible targets of selection turn out to have been largely irrelevant to evolutionary fitness for modern humans in this era.

When and where did these adaptations become common?

Of course, knowing that there is a distinction doesn't necessarily tell us when that distinction arose.  Did it arise in the Middle Paleolithic, when modern humans left Africa; did it arise in the Upper Paleolithic, when modern humans settled in Europe, Australia, Papua New Guinea, Japan and the Americas for the first time; or did it arise in the Neolithic together with the shift from a hunting and gathering mode of food production to a farming and herding mode of food production, or could it be an even more recent metal age development?

The maps, which show Papua New Guinea and the Americas to be largely congruent with the Asian pattern, suggest that these genotype differences arose at least as far back as the Upper Paleolithic and prior to the Neolithic revolution in Asia. Otherwise, American and Papuan populations, which were genetically isolated from Neolithic populations until a few hundred years ago, would look more like the African populations and less like the Asian ones.

Although it is harder to eyeball, there appear to be (and the charts in the body of the paper support the conclusion that there are) lower frequencies of type two diabetes risk genes in areas that have "Southern Route" population histories in Asia, and intermediate frequencies of type two diabetes risk genes in Europe and in areas with ties to Central Asia that at some point or another experienced a significant and lasting Indo-European presence.

This suggests a two step selection process - one common to all non-Africans and possibly diluted at the African fringe by short distance migration, and a second one particular to non-Africans who got far enough along on a Southern route to make it past South Asia and were subject to heightened selective pressure.

Alternatively, one could also imagine a scenario in which many of the common type two diabetes alleles emerged somewhere in Asia where they approached fixation. The presence of these alleles at all in other parts of the Old World could then be entirely due to back migration from Asia.

Indeed, the distribution of these genes disfavors an association with Denisovan admixture, which is much more narrowly distributed, although one could fit a Neanderthal source to their distribution quite easily so long as one assumed that they conferred more selective advantage in Asia than in Europe.

To the extent that these were disease resistance alleles that had a selective advantage for non-Africans other than Asians (even if the advantage was not as great for Europeans as for East Asians, for example), it is worth noting that the overall genetic contribution of the back migrating population could have been much smaller than its percentage contribution of the specific alleles at these specific locations in the genome, where the percentages would be amplified over tens of thousands of years by the fitness advantage that they conferred.

For example, even though 40% of Europeans have some type two diabetes risk reduction allele that has reached near fixation in China today, that could easily have emerged from a back migration that is a source for only 4% or less of the overall whole genome of Europeans.
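
The arithmetic behind that intuition: under simple genic selection an allele's odds grow by a factor of (1 + s) per generation, so one can solve directly for the selection coefficient needed to take an allele from a 4% starting frequency to 40% over roughly 40,000 years. A sketch with illustrative numbers only, matching the hypothetical example above:

import math

# Under genic selection the allele's odds p/(1-p) are multiplied by (1 + s)
# each generation, so s can be solved for in closed form.  All numbers here
# are illustrative, matching the hypothetical 4% -> 40% example above.

p_start, p_end = 0.04, 0.40        # introgressed fraction vs. present-day frequency
years, generation_time = 40_000, 29
generations = years / generation_time

odds_ratio_change = (p_end / (1 - p_end)) / (p_start / (1 - p_start))
s = math.exp(math.log(odds_ratio_change) / generations) - 1

print(f"required selection coefficient s ~ {s:.4f}")   # ~0.002 per generation

A per-generation advantage of a fifth of a percent is small compared with the strongest selection coefficients estimated in humans, which is why a modest back-migrating contribution could plausibly come to dominate at these particular loci while leaving the rest of the genome largely untouched.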

Such a back migration could easily have happened, for example, in a period some time after the Toba eruption (ca. 74,000 years ago), which is a plausible geoclimatological event that might have coincided with the arrival of modern humans in Southeast Asia from South Asia, but long before modern humans started to displace Neanderthals in Europe (ca. 42,000 years ago). Hence, the presence of these disease risk reducing alleles may date back to a period when the back migration was to a proto-European population in South Asia, Iran or the Middle East, rather than to modern humans who actually lived in Europe at the time.

What kind of events could have caused these genes to confer a fitness advantage?

Perhaps the genes approach fixation because at some point in ancient prehistory in Asia only holders of the genes could survive some genetic bottleneck in significant numbers, or because the fitness enhancement was greater given the foods available in Asia than in Europe and the Middle East.

For example, perhaps there are more plants with natural sweet sugars in Asia than Europe and the Middle East, so an ability to manage blood sugar levels became more important.

What would a bottleneck like that look like?  I'd imagine one where the ability to be both obese and healthy in the long term would provide an advantage.  Thus, you'd imagine perhaps a hunting and gathering population that experienced both "fat times" and "lean times" where the best adapted individuals got quite obese in the fat times, without developing disadvantageous diabetes, and then were able to survive lean times as a result of having these great food reserves that the peers who either got fat and died from diabetes, or didn't get fat in the first place, lacked.  These genes may have served a purpose analogous to the one served by a camel's ability to store water internally for long periods of time after gorging itself.

At a generalized level, type two diabetes risk reduction alleles might confer fitness primarily by providing a survival advantage in circumstances when food supplies are unreliable, something that may not have been nearly so much of a selective pressure for Africans many of whom may have enjoyed (and still enjoy) a more stable tropical and subtropical climate and seasons that had less of an impact on food supplies for hunters and gatherers.  Those inclined to put things in Biblical metaphor could describe Africa as an Eden and the Eurasians as exiles from Eden who faced greater struggles to meet their needs from nature that made it fitness enhancing to have these alleles.

And, the Edenic nature of the food supply in Africa wouldn't have been coincidental. Modern humans, our hominin predecessors, and our primate predecessors evolved over millions of years to be ideally suited to African life. We may have been better able to find food all year round, year after year, in Africa, because only the ancestors who learned to hunt, gather and eat a wide enough range of foods to do so thrived and were rewarded by the evolutionary process. Had modern humans evolved in Southeast Asia instead of Africa, perhaps we would, like pandas, be able to digest bamboo. But, since we evolved in Africa, rather than in Europe or Asia, even our omnivorous diet may have been able to secure nourishment from a smaller percentage of the potentially edible biomass there than it could in Africa. And, the smaller the percentage of biomass one can digest, the more unstable one's food supply will be and the more one needs to be able to store calories in fat in good times so one can survive later in hard times.

One More Piece of Evidence Added To The Clues That Reveal Pre-History

The selectively driven type two diabetes risk allele distributions (and despite my lumping together of these alleles in the discussion above, there are really at least six separate distributions to consider, which could have spread to world populations in two or more separate events) provide another tool, on top of uniparental genetic markers, autosomal genome data, and other genes with known functions and geographically distinctive distributions (like lactase persistence genes and blood types), that allows us to make inferences (bound by the limitation that the inferences be mutually consistent) about human prehistory, which we are otherwise hard pressed to understand with the very sparse available archaeological evidence.

Thursday, January 5, 2012

New Finding Hints At Common Mechanism in Alzheimer's And Autism

The amyloid precursor protein is typically the focus of research related to Alzheimer's disease. However, recent scientific reports have identified elevated levels of a particular protein fragment, called sAPP-α, in the blood of autistic children. The fragment is a well-known growth factor for nerves, and studies imply that it plays a role in T-cell immune responses as well.

From here.

Abnormal immune function has been noted in children with autism before, but no cause had previously been identified. The new study suggests that this protein that is already a biomarker for Alzheimer's disease may also be a biomarker for autism. Alzheimer's disease is the most well known form of geriatric dementia, although pre-clinical signs of it may start to manifest as early as young adulthood.

Autism is typically first diagnosed in preschool children. Narrow definition autism is a characteristic subtype of developmental disability, found in 1 in 110 children and disproportionately affecting boys, that is part of an "autism spectrum" found in more children, which at its milder end is sometimes described as a form of mere neurodiversity.

The research suggests that it may be possible in a few years to do a blood test for autism, allowing for earlier diagnosis, which could be helpful if earlier treatments have a better chance of being effective, and could also reduce the risk of misdiagnosis leading to inappropriate treatment.

Wednesday, August 31, 2011

Opium Used As Drug 4500 Years Ago In Spain

There is evidence of opium use as a drug (medicinal and ritual) from 4,500 years ago in Spain at several sites in Andalusia and Catalonia. The use of opium as a drug was previously known in the Danubian Neolithic of Eastern Europe, but evidence of this use had not been found in Southwest Europe. The find predates the arrival of Celtic culture in the region, which is frequently viewed as the point at which the Indo-European languages arrived in Iberia, and instead dates to the mid-to-late Stone Age farming stage of civilization there.