Showing posts with label neurodiversity. Show all posts

Thursday, June 15, 2023

Brain Size v. Body Size In Mammals

It turns out that the relationship between brain size and body size in mammals is pretty simple and universal if you don't insist on linear relationships.
Despite decades of comparative studies, fundamental questions remain about how brain and body size co-evolved. Divergent explanations exist concurrently in the literature for phenomena such as variation in brain relative to body size, variability in the scaling relationship across taxonomic levels and between taxonomic groups, and purported evolutionary trends. Here we resolve these issues using a comprehensive dataset of brain and body masses across the mammal radiation, and a method enabling us to study brain relative to body mass evolution while estimating their evolutionary rates along the branches of a phylogeny. 
Contrary to the rarely questioned assumption of a log-linear relationship, we find that a curvilinear model best describes the evolutionary relationship between log brain mass and log body mass. This model greatly simplifies our understanding of mammalian brain-body co-evolution: it can simultaneously explain both the much-discussed taxon-level effect and variation in slopes and intercepts previously attributed to complex scaling patterns. 
We also document substantial variation in rates of relative brain mass evolution, with bursts of change scattered through the tree. General trends for increasing relative brain size over time are found in only three mammalian orders, with the most extreme case being primates, setting the stage for the uniquely rapid directional increase that ultimately produced the computational powers of the human brain.
Chris Venditti, Joanna Baker, Robert A. Barton, "Co-evolutionary dynamics of mammalian brain and body size" bioRxiv (June 9, 2023) https://doi.org/10.1101/2023.06.08.544206
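The paper's headline claim is statistical: in log-log space, a quadratic describes brain-body data better than a straight line. A minimal sketch of that model comparison, using simulated data and invented coefficients (not the paper's estimates):

```python
import numpy as np

# Simulated brain/body masses whose true log-log relation is curvilinear:
# the slope flattens as body size grows. All numbers here are invented.
rng = np.random.default_rng(0)

log_body = rng.uniform(0, 8, 500)  # log10 body mass
log_brain = 0.9 + 0.75 * log_body - 0.02 * log_body**2 \
    + rng.normal(0, 0.1, 500)      # noisy curvilinear "truth"

# Fit a log-linear (degree 1) and a curvilinear (degree 2) model.
lin = np.polyfit(log_body, log_brain, 1)
quad = np.polyfit(log_body, log_brain, 2)

def rss(coeffs):
    """Residual sum of squares for a polynomial fit."""
    return float(np.sum((log_brain - np.polyval(coeffs, log_body)) ** 2))

print(f"linear RSS:    {rss(lin):.2f}")
print(f"quadratic RSS: {rss(quad):.2f}")  # smaller: the curve wins
```

In the actual study the comparison is phylogenetic, made along the branches of the mammal tree rather than across raw species points, but the model-selection logic is the same.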

Friday, September 9, 2022

Key Genetic Differences Between Humans And Other Hominins

We are starting to reach the point where comparisons of modern human DNA and ancient DNA can tell us fairly precisely how we differed from archaic hominins and which differences mattered the most. The New York Times explains the latest development on this front:

Scientists have discovered a glitch in our DNA that may have helped set the minds of our ancestors apart from those of Neanderthals and other extinct relatives.

The mutation, which arose in the past few hundred thousand years, spurs the development of more neurons in the part of the brain that we use for our most complex forms of thought, according to a new study published in Science on Thursday.

“What we found is one gene that certainly contributes to making us human,” said Wieland Huttner, a neuroscientist at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and one of the authors of the study.
The most obvious feature of the human brain is its size — four times as large as that of chimpanzees, our closest living relatives.

Our brain also has distinctive anatomical features. The region of the cortex just behind our eyes, known as the frontal lobe, is essential for some of our most complex thoughts. According to a study from 2018, the human frontal lobe has far more neurons than the same region in chimpanzees does.

But comparing humans with living apes has a serious shortcoming: Our most recent common ancestor with chimpanzees lived roughly seven million years ago. To fill in what happened since then, scientists have had to resort to fossils of our more recent ancestors, known as hominins.

Inspecting hominin skulls, paleoanthropologists have found that the brains of our ancestors dramatically increased in size starting about two million years ago. They reached the size of living humans by about 600,000 years ago. Neanderthals, among our closest extinct hominin relatives, had brains as big as ours. . . .

But Neanderthal brains were elongated, whereas humans have a more spherical shape. Scientists can’t say what accounts for those differences. One possibility is that various regions of our ancestors’ brains changed size. . . .

In recent years, neuroscientists have begun investigating ancient brains with a new source of information: bits of DNA preserved inside hominin fossils. Geneticists have reconstructed entire genomes of Neanderthals as well as their eastern cousins, the Denisovans.

Scientists have zeroed in on potentially crucial differences between our genome and the genomes of Neanderthals and Denisovans. Human DNA contains about 19,000 genes. The proteins encoded by those genes are mostly identical to those of Neanderthals and Denisovans. But researchers have found 96 human-specific mutations that changed the structure of a protein.

In 2017, Anneline Pinson, a researcher in Dr. Huttner’s lab, was looking over that list of mutations and noticed one that altered a gene called TKTL1. Scientists have known that TKTL1 becomes active in the developing human cortex, especially in the frontal lobe
. . .

For their final experiment, the researchers set out to create a miniature Neanderthal-like brain. They started with a human embryonic stem cell, editing its TKTL1 gene so that it no longer had the human mutation. It instead carried the mutation found in our relatives, including Neanderthals, chimpanzees and other mammals.

They then put the stem cell in a bath of chemicals that coaxed it to turn into a clump of developing brain tissue, called a brain organoid. It generated progenitor brain cells, which then produced a miniature cortex made of layers of neurons.

The Neanderthal-like brain organoid made fewer neurons than did organoids with the human version of TKTL1. That suggests that when the TKTL1 gene mutated, our ancestors could produce extra neurons in the frontal lobe. While this change did not increase the overall size of our brain, it might have reorganized its wiring.

As a postscript, it is hard to overstate how incredibly advanced the work of extracting ancient DNA samples, and of making sense of that DNA, really is.

Thursday, June 30, 2022

What Drives The Distribution Of Tonal Languages And Correlated Phonetic Features?

This post was originally started and mostly written a few years ago. It is refined, expanded, and published now. (I've now cleared my backlog of draft posts.)

Languages with Complex Tone Systems

Languages with Simple Tone Systems

Languages without Tone Systems


Languages with labial-velar consonants in yellow; 
languages with clicks in red and black.


Languages with glottal consonants other than ejectives
Purple and yellow have implosives only; 
red and white have glottalized resonants only; 
green and aqua have implosives and glottalized resonants.

Charts via WALS Online.

What is a tonal language?

In a tonal language, pitch patterns (tones) distinguish individual words or the grammatical forms of words, such as the singular and plural forms of nouns or different tenses of verbs.

Tonality appears to be part of a language's total phoneme inventory, which also includes its consonants, glottal stops, vowels, and click sounds.


* The average language with a complex tone system has 26.0 consonants and 7.05 vowels, for a total of 33.05 phonemes.
* The average language with a simple tone system has 23.3 consonants and 6.28 vowels, for a total of 29.58 phonemes.
* The average language with no tone system has 22.1 consonants and 5.58 vowels, for a total of 27.68 phonemes.
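The totals in the list above are just the consonant and vowel averages summed; a quick check:

```python
# Average phoneme inventories from the list above: (consonants, vowels).
inventories = {
    "complex tone system": (26.0, 7.05),
    "simple tone system":  (23.3, 6.28),
    "no tone system":      (22.1, 5.58),
}

for system, (consonants, vowels) in inventories.items():
    print(f"{system}: {consonants + vowels:.2f} total phonemes")
```

This prints 33.05, 29.58, and 27.68, matching the figures above.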

Tonal languages also tend to be more likely to have implosive consonants (a type of glottal consonant), glottal resonant consonants, labial-velar consonants, and linguistic click consonants.

Where Are Tonal Languages Spoken?

As the charts at the top of this post demonstrate, tone languages are vastly more common in places with tropical (or at least subtropical) climates and very rare elsewhere.

Languages with simple tone systems show a similar, but less pronounced tendency. All of the tonal languages outside tropical and subtropical areas have only simple tone systems. 

Some of the more controversial cases involve claimed simple tone systems in places where tonal languages are rare. A dozen of the languages classified as having simple tone systems are among the most geographically atypical and are only marginally tonal, to the extent that they arguably would be more properly classified as non-tonal. These include Norwegian, Japanese, Ainu and Oneida (Iroquoian; New York State).

Much of South Asia, however, despite having many languages and a tropical or subtropical climate, appears to have no languages with a complex tone system among its many Indo-European, Dravidian, or Austroasiatic languages, although it does have a handful of Sino-Tibetan languages with simple tone systems in the Himalayan highlands and in the far northeast of the subcontinent.

There are definitional issues about what constitutes a language with a complex tone system, a simple tone system or no tone system. WALS explains its definitions (emphasis added):
The first distinction made in this chapter is between languages with and languages without tones. For most languages it is easy to determine if the language does or does not make use of tone, but there are surprisingly sharp disagreements in certain cases. 
For example, Dar Fur Daju (Nilo-Saharan; Sudan) is reported as non-tonal in one source but transcribed with three tone levels in another. Ket (Yeniseian; northern Siberia) is described as having none, two, four or eight tones by different authors (there are some differences in the dialects being described, but this does not account for the differences of opinion on the tonal status of the language). Both these languages have been counted as non-tonal in the present chapter since the opinion that they lack tones seems to be the most well-supported (see Thelwall 1981 and Feev 1998 respectively).  
Other languages have clear word-level pitch phenomena but with limited function, or with roles that look more like stress in that they highlight a particular syllable of a word. Norwegian, Japanese, Ainu and Oneida (Iroquoian; New York State) are among languages of this kind. These languages are classified here as tonal, but are perhaps only marginally so.  
Of the 526 languages included in the data used for this chapter, 306 (58.2%) are classified as non-tonal. This probably underrepresents the proportion of the world’s languages which are tonal since the sample is not proportional to the density of languages in different areas. 
For example, from the large Niger-Congo family of Africa there are 68 languages in the sample, 5 of which are nontonal (Swahili, Diola-Fogny, Koromfe, Wolof and Bisa) and the remainder tonal. The Ethnologue (Grimes 2000) lists 1489 Niger-Congo languages, so less than 5% of the Niger-Congo languages are included. 
Of the Indo-European languages of western and central Europe, 16 are included (5 Romance, 3 Germanic, 3 Slavic, 2 Celtic, 1 Baltic, Greek, and Albanian). In these Indo-European groups the Ethnologue lists a total of 145 languages (7 Celtic, 58 Germanic, 48 Italic, 18 Slavic, 7 Greek, 4 Albanian, and 3 Baltic languages), so that over 10% of the Western European languages listed are included, only two of which are tonal or marginally so and the rest non-tonal. 
If, correspondingly, 10% of the Niger-Congo family had been included, 80 additional tone languages would have been included. 
Languages without tones predominate in the western part of the Eurasian landmass, including South Asia, in the more southerly regions of South America, and in the coastal area of northwestern North America. In this last area great genealogical diversity exists among the indigenous languages, but tone is almost entirely absent. In addition, no Australian language has been reported to be tonal.  
The languages with tones are divided into those with a simple tone system — essentially those with only a two-way basic contrast, usually between high and low levels — and those with a more complex set of contrasts. 
About a quarter of the languages (132, or 25.1%) have simple tone systems. This includes 12 languages which appear to meet the definition of being tonal only marginally. With better information a few of these might end up being classed as non-tonal. 
Less than a fifth (88, or 16.7%) have complex tone systems. Tone languages have marked regional distributions. Virtually all the languages in Africa are tonal, with the greater number having only simple tone systems, although more complex systems are not unusual, especially in West Africa. Languages with complex tone systems dominate in an area of East and Southeast Asia. Several clusters of languages with tones occur in South, Central and North America. A number of the languages of New Guinea are also tonal, or at least marginally so.
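WALS's sampling-bias extrapolation can be roughly reproduced from the numbers in the quoted passage: apply the Western European sampling rate (about 10%) to the Niger-Congo family. This reconstruction lands near, though not exactly on, the quoted "80 additional tone languages," so the chapter presumably rounded differently:

```python
# Numbers from the quoted WALS passage.
niger_congo_total = 1489      # Ethnologue count of Niger-Congo languages
sampled = 68                  # Niger-Congo languages in the WALS sample
tonal_sampled = sampled - 5   # all but 5 sampled languages are tonal

# What a ~10% sampling rate (as for Western Europe) would have added.
target_sample = round(0.10 * niger_congo_total)  # ~149 languages
extra_sampled = target_sample - sampled          # ~81 more languages
tonal_share = tonal_sampled / sampled            # ~93% tonal
extra_tonal = extra_sampled * tonal_share

print(f"extra tone languages at 10% sampling: ~{extra_tonal:.0f}")
```

This yields about 75 additional tone languages versus WALS's stated 80; either way, the point stands that the 58.2% non-tonal figure overstates the global share of non-tonal languages.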

Tonality Appears To Be Primarily An Areal Rather Than A Language Family Based Property Of Languages

There are language families in which some languages are tonal, while others are not.

As noted above, two Indo-European languages arguably have simple tone systems, although at least one of these is a marginal case with a dubious classification.

Only five of Africa's Niger-Congo languages do not have tone systems.

Within the Afroasiatic language family, tonal languages appear in the Omotic, Chadic, and Cushitic branches of Afroasiatic (the Southern tier of Afroasiatic languages, mostly in Ethiopia and the African Sahel), according to Ehret (1996), but the Semitic, Berber, and Egyptian branches do not use tones phonemically.

Most Austroasiatic languages are tonal, but not the Munda languages of South Asia and not five of the lesser known Austroasiatic languages of Vietnam and Laos.

The Austronesian languages aren't uniform with regard to tonality either: "Unlike in the languages of Mainland Southeast Asia, tonal contrasts are extremely rare in Austronesian languages. Exceptional cases of tonal languages are Moklen and a few languages of the Chamic, South Halmahera–West New Guinea and New Caledonian subgroups."

Local temperatures and humidities influence sound transmission through the air, and terrain (e.g. tree density) affects how far your spoken words need to carry. So, one theory is that tone languages arise in places where the sound transmission qualities of the air and terrain favor them.

There have been suggestions in the literature that the local climate and ecology can make certain phoneme sets better in some places than in others, that the nature of one part of a phoneme set influences the nature of other parts of the phoneme set, and that there are specific non-random factors that favor particular subtypes of phonemes in particular conditions.
An environmental explanation is supported by the observation that tonality seems to be an areal effect rather than one that tracks language families. Global patterns of semantic tone use provide a fair amount of circumstantial evidence for this: neighboring languages from different families often share semantic tonality, while languages within the same family often differ in their use of it.
Incidentally, the geographic distribution of languages with tone systems is similar, although not identical, to the geographic distribution of languages with glottal consonants. Both are most common in sub-Saharan Africa, Southeast Asia, and the subtropical and tropical regions of the Americas (although the Americas are far from uniform despite all except the Na-Dene and Inuit language families probably having a common ancestor ca. 14kya). But, the Chinese dialect family uses tone, while it does not utilize glottal consonants. 
It could be that the ancestral hominin-type ASPM gene correlated with tonal languages “tunes” one's hearing system to better distinguish sounds in a certain pitch range in places with the temperatures, humidities and terrain most conducive to tonal languages, while the derived-type ASPM gene loosens the focus of the hearing system so that it isn't primed to maximize hearing in one particular set of conditions, which would be adaptive elsewhere.
This is a more complex hypothesis than the one proposed in the 2007 paper showing a relationship between tonal languages and this gene. That paper's abstract is as follows:
The correlations between interpopulation genetic and linguistic diversities are mostly noncausal (spurious), being due to historical processes and geographical factors that shape them in similar ways. Studies of such correlations usually consider allele frequencies and linguistic groupings (dialects, languages, linguistic families or phyla), sometimes controlling for geographic, topographic, or ecological factors. 
Here, we consider the relation between allele frequencies and linguistic typological features. Specifically, we focus on the derived haplogroups of the brain growth and development-related genes ASPM and Microcephalin, which show signs of natural selection and a marked geographic structure, and on linguistic tone, the use of voice pitch to convey lexical or grammatical distinctions. 
We hypothesize that there is a relationship between the population frequency of these two alleles and the presence of linguistic tone and test this hypothesis relative to a large database (983 alleles and 26 linguistic features in 49 populations), showing that it is not due to the usual explanatory factors represented by geography and history. The relationship between genetic and linguistic diversity in this case may be causal: certain alleles can bias language acquisition or processing and thereby influence the trajectory of language change through iterated cultural transmission.
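The logic of the study's test can be sketched with synthetic data (the sample below is invented, not the study's 49 real populations): give allele frequencies geographic structure, make tonality depend on them, and measure the raw correlation. The real paper additionally controls for geography and history, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 49  # same count as the study's populations, but synthetic

# Invented structure: derived-allele frequency varies with longitude,
# and tonality is more likely where the derived allele is rare.
longitude = rng.uniform(-180, 180, n)
allele_freq = np.clip(0.5 + 0.002 * longitude + rng.normal(0, 0.1, n), 0, 1)
tonal = (allele_freq + rng.normal(0, 0.1, n) < 0.5).astype(float)

r = np.corrcoef(allele_freq, tonal)[0, 1]
print(f"allele-frequency vs. tone correlation: {r:.2f}")
```

By construction the correlation comes out negative; the study's harder task was showing that the real-world association survives after geographic and historical confounds are partialed out.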

Still earlier versions of this gene are associated with brain size:
The size of human brain tripled over a period of approximately 2 million years (MY) that ended 0.2-0.4 MY ago. This evolutionary expansion is believed to be important to the emergence of human language and other high-order cognitive functions, yet its genetic basis remains unknown. An evolutionary analysis of genes controlling brain development may shed light on it. ASPM (abnormal spindle-like microcephaly associated) is one of such genes, as nonsense mutations lead to primary microcephaly, a human disease characterized by a 70% reduction in brain size. Here I provide evidence suggesting that human ASPM went through an episode of accelerated sequence evolution by positive Darwinian selection after the split of humans and chimpanzees but before the separation of modern non-Africans from Africans. Because positive selection acts on a gene only when the gene function is altered and the organismal fitness is increased, my results suggest that adaptive functional modifications occurred in human ASPM and that it may be a major genetic component underlying the evolution of the human brain.
The case that your ASPM variant enhances fitness primarily by making your hearing system better adapted to your primary environment makes more sense to me in an evolutionary selective fitness sense. If people with the region-appropriate variant hear subtle sound differences better than people who lack it, that could increase the ability of a hunter-gatherer to locate prey, to detect predators, to locate lost children who have wandered far away, to hear enemies coming, to detect a fire that has gotten out of control or something underfoot that is about to break, and cumulatively that could produce a gradual but persistent selective fitness advantage in the evolutionary sense.
I find it harder to believe that the tone language specific application of this trait would have much of a selective fitness effect. An inability to distinguish by sound alone two words that would both make contextual sense in a tonal language that you and the speaker share might tweak one’s social status in the community a little, but it seems less likely to have a big impact on mortality or lifetime reproductive success. It’s not impossible, but it would seem like a weaker explanation.
Against this backdrop, a question arises that wouldn't otherwise be obvious: why aren't there more tonal languages in South Asia, a region with substantial linguistic diversity and, in part, a climate very similar to the places in Africa, Southeast and East Asia, and the Americas where tonal languages predominate?
One partial answer to this is that the Indo-Aryan languages developed in places that did not have this climate and didn’t spontaneously pick up this feature upon arriving in the subcontinent. 
Migration can also explain the affirmative presence of these features in arid southern Africa where people speaking these languages probably migrated from more tropical parts of Eastern sub-Saharan Africa.
But, this doesn’t explain why we don’t see tonal Munda and Dravidian languages in South Asia. The urheimat of the Austroasiatic languages, of which the Munda languages are a branch, is Southeast Asia (or perhaps southern China), where the vast majority of languages are tonal. And the Dravidian languages, as far as anyone knows, are autochthonous to South Asia.
In both of these exceptional cases, I think the likely explanation is a language learner effect.
The Munda languages, at least initially, seem to have had a fairly northerly distribution within South Asia, where hearing well suited to tonality wouldn't have been advantageous to the local people, who probably accounted for all or most of the women in the community at the time of first contact, when the Munda languages would have been adopted by people integrated into the early Munda communities. If half the people had trouble hearing the tones, that feature, which was almost surely present in an ancestral pre-Munda language, probably didn't survive.
In the case of the Dravidian languages, whose ancestral version was probably tonal under the environmental hypothesis, the pertinent fact is that there are no meaningful communities in Dravidian India without substantial ANI admixture dating to the last 2,000-3,500 years. The language learner effect at the time of ANI-ASI admixture could have stripped the then-existing Dravidian languages of their tonal features for the same reasons. The ubiquity in Dravidian India of the Hindu religion, which has clear Indo-Aryan and Harappan synthesis origins, likewise suggests that the language learners were not just anybody: they were culturally influential elites whose language choices tend to influence whole communities by cultural imitation.

Sunday, December 27, 2020

What Will Humans Look Like In 100,000 Years?

A cute little speculative article, with before and after pictures, imagines what humans might look like 100,000 years from now due to evolution after living in space for a while.

Now


In 100,000 years

The article motivates its pictures as follows: 

In 2007, artist and researcher Nickolay Lamm partnered with computational geneticist Dr. Alan Kwan and came up with three illustrations. He first hypothesized what we might look like in twenty thousand years, the second in sixty thousand years, and the third 100 thousand years into the future.

According to Lamm, this is “one possible timeline” that takes into consideration both human evolution and advancements in technology and genetic engineering. Lamm and Kwan imagined a possible future where humans would have a much greater ability to control the human genome, and where their living environments might be much different than ours [1].

Here are some of the major changes that could happen, and the reasons why they might occur, according to Kwan and Lamm: 
A larger forehead 

The human forehead has been increasing in size since the fourteenth and fifteenth centuries. According to scientists, when you measure skulls from that time and compare them to our own, people today have less prominent facial features and higher foreheads [2]. It seems logical, then, to imagine a future where our skulls continue to grow to accommodate larger and larger brains. 
Changed Facial Features

Given the advancements we have already made in genetic engineering, Kwan based some of his hypotheses on the assumption that we will be even further ahead sixty thousand years from now. He argued that a greater ability to control the human genome will mean that evolution will have little effect on human facial features. Essentially, our faces will change depending on human preferences, like larger eyes, a straighter nose, and more symmetry between both sides of our faces. 
He also suggested that by then, it is possible that humans will have begun colonizing other planets. For people living in places that are farther from the sun, and therefore less bright, eyes may get bigger to enhance vision. Skin may also be darker to lessen the damage from UV rays outside of earth’s protective atmosphere.
Additionally, he proposed that people will have thicker eyelids, and their frontal bone under their brow will be more pronounced. This will help humans to deal with the disruptive effects of cosmic rays. We already see these effects happening with today’s astronauts.

Kwan says that over the remaining forty thousand years, those features that humans selected for will become even more pronounced. One hundred thousand years from now, it is possible that humans’ eyes, for example, will seem unnervingly large compared to what we are used to today. 
Functional Necessities

Other, smaller changes may be things like larger nostrils. This will allow humans to breathe easier when they’re living on other planets. People may have denser hair to keep their larger heads warm. In an age, however, when you can genetically alter almost any feature about yourself, Kwan suggests that features that make us look naturally human will become more favorable.

It is an interesting exercise although it misses some obvious points.

Time Horizons

Based upon the past experience of the human species, the notion that the kind of basic visual phenotypes focused upon here would take 100,000 years to change is not very credible.

Most of the common phenotypes associated with major "racial" types today evolved much more rapidly. There was no one in Europe who looked like a typical modern Northern European in 4000 BCE. 

Bantu expansion in Africa, in the same time period, caused one West African phenotype to become predominant in most of sub-Saharan Africa, leaving only small relict populations of "Paleo-Africans" like pygmies and Khoi-San people, and subtly reducing the distinctiveness of the typical linguistically Nilo-Saharan East African.

A lot of the distinctiveness of East Asian phenotypes is attributable to the selective sweep of the EDAR gene which is much older than either of the other two examples, but happened, when it did happen, in a matter of a few thousand years, not tens or hundreds of thousands of years.

Genetic engineering, to the extent it occurs, will likewise have its biggest impacts in a matter of a century or two, not thousands of years. And while the article notes that some geneticists at the time put this kind of genetic engineering thousands of years in the future, CRISPR technology, already available and still in its infancy with lots of room to improve, suggests that it is much less far off than expected in 2007 when the exercise was done, especially for visually striking but largely cosmetic features that involve only small numbers of genes, like eye, hair and skin color, hair texture and curliness, propensity to tan, and freckles.

Genetic engineering will probably start with efforts to actively select against "defects" from Down syndrome to bad teeth and vulnerability to specific genetic diseases with simple recessive Mendelian inheritance patterns. Simple cosmetic adjustments that are well understood may follow. And, from there, the chase for enhancements, such as additional cones that allow people to see more colors, may begin. I see those steps taking perhaps a generation or two at a time to progress from one level to the next.

I fully expect the appearance of the average human to be significantly visually different by 3000 CE, not merely tens of thousands of years hence.

Admixture

Sufficiently far in the future, there will be few, if any, people who are phenotypically comparable to modern Europeans, due to admixture.

The world almost surely will see dramatically more admixture over the next few centuries than it has since the Bronze Age. In this regard, Betty Crocker has probably been more predictive, forming a composite, mixed-race image of the company's customer:

For her 75th anniversary in 1996, a nationwide search found 75 women of diverse backgrounds and ages who embody the characteristics of Betty Crocker. The characteristics that make up the spirit of Betty Crocker are: enjoys cooking and baking; committed to family and friends; resourceful and creative in handling everyday tasks; and involved in her community. A computerized composite of the 75 women, along with the 1986 portrait of Betty, served as inspiration for the painting by internationally known artist John Stuart Ingle. The portrait was unveiled March 19, 1996, in New York City.

Better yet is this National Geographic image:

Selection On Standing Variation


Image from here.

The massive intercontinental admixture we are likely to see also reflects another key point about both natural evolution and selective breeding: most evolutionary selection acts on the existing range of variation among people, rather than on new mutations that prove beneficial, which are the exception.

Indeed, one of the real challenges in the first wave of genetic engineering will be to resist the temptation to completely remove genetic diversity, biodiversity and neurodiversity from the human species whenever a trait within the standing range of human variation seems to be a net minus in current conditions, only to discover that the trait had unrecognized benefits sometime long in the future. Otherwise, the demise of human genetic diversity could mirror the mass extinction of species and languages on Earth during the late Holocene.

Many of the most important bits, like HLA immunity complexes and temperaments better suited to the world of the far future, are also largely invisible a priori, although their functional connections to other traits may make that less true than one would expect, as discussed below.

On the other hand, genetic engineering can revolutionize that analysis. For example, if a few decades from now we find a gene that is very valuable in extremely genetically dissimilar octopuses but totally absent from vertebrate genetic diversity, we might very well use genetic engineering to make it a common variant in the human genome.

Self-Domestication

Humans are likely to continue to select for traits associated with domestication of animals, a process that is already well underway.
Darwin observed that domesticated animals share certain traits across species. Domesticates tend to have floppier ears than their wild counterparts, and curlier tails. They're smaller and have recessed jaws and littler teeth. Domestication also shrinks the amygdala, the brain's fear center, leading to a reduction in aggressive, fearful reactions.

Belyaev noticed that his domesticated foxes eventually developed black and white, or piebald, spots, now known to be a classic sign of domestication. Think of the black and white pelts of cows, horses, dogs, and cats – especially those white-footed felines we claim "wear socks."

The thing is, with the exception of docility, these characteristics don't do anything at all.

Research like Belyaev's made it apparent that if you select for friendliness and cooperation in foxes, you get a host of features that come along for the ride that don't serve a purpose – in evolutionary parlance they're non-adaptive, much like the male nipple. Together this suite of traits is called the "domestication syndrome."

For years scientists have recognized that domestication seems to preserve childlike psychological and physical tendencies, especially those that elicit care from parents and other adults. "Cuter" features. A little more helplessness. And friendliness towards humans, supporting Hare's argument. Recent science has helped piece together why this is.

During vertebrate development there is a strip of what are called neural crest cells running down the back of the embryo. As we grow inside the womb, these cells migrate throughout the body to help form the cartilage and bone of our face and jaw, the melanin-producing cells that give our skin pigment, and part of our peripheral nervous system. They also form our adrenal glands, which, among other functions, release cortisol — our "stress hormone" — and adrenaline, involved in our fight or flight response.

Domesticated animals have smaller adrenal glands. Hare believes selection for friendliness results in less neural crest migration, and as a result, less aggressive, reactionary behavior driven by adrenal hormones.

But fewer neural crest cells reaching their intended targets also influences the other traits driven by their voyage through the body, explaining the smaller snouts and jaws seen in domesticates, and white patches of fur lacking melanin. Scientists now know that domestication — whether artificial or natural — seems to involve selection on a gene called BAZ1B, which helps drive neural crest migration during development.
Baby faces are the future. Other related traits, like enhanced childhood language learning relative to adults, may also be subject to genetic engineering in the future. Williams syndrome, which bears some similarities to descriptions of mythical elves, illustrates what this might look like.


Technology Facilitates Adaptation And Determines What Is Selected For

Technology can also reduce selective pressure on a lot of obvious physical features, like skin color, with things like Vitamin D supplements and sunscreen picking up the slack.

Technology can also determine which human traits merit selection.

For example, for most of human history, food supplies were limited and irregular, and there was strong selection for a capacity to survive periods of famine. But in the modern world, selection is likely to focus on an ability to escape the downsides of obesity in a more sedentary world where food is abundant and consistently available.

Physically wild movement and hyperactivity may have been beneficial in much of human history, while the future looks likely to reward a capacity to be still and focused.

Some traits may be subject to genetic engineering, while other forms of non-genetic bioengineering, like nutritional supplements, vaccines, and hormone treatments, may also play important parts. Nutrition without genetics can prevent common visible consequences of malnutrition and can enhance height. Everyone in the future will probably be healthier overall, thanks to a variety of genetic and non-genetic methods and to cultural adaptations to modern technologies that we are struggling with now.

We might end up with genetic engineering that mitigates nearsightedness, adapting to a world where regularly reading fine print is essential; but if laser eye surgery continues to advance, that may not be a priority, as other forms of bioengineering will do the trick.

Self-driving cars and other intelligent safety features in mechanical and chemical things may reduce the selective pressure that our current society imposes against people who impulsively drive too recklessly, cross the street without carefully looking both ways, drink and drive, or otherwise engage in conduct that has grown much more deadly for the average person than it was in the days when land travel was mostly by foot. Likewise, vulnerability to jet lag might matter less in an era of self-driving cars and AI safety features than in one without them, where grogginess behind the wheel can be deadly.

Intelligence has lots of value in the modern world, but large infant head size also increases the risk of death to mother and child in natural childbirth. This risk may be alleviated if C-sections become universally available, allowing for larger heads and bigger brains (even though average brain size seems to be declining in recent decades).

Another factor that has impeded the evolution of larger brains is energy drain. The brain accounts for about 20% of the body's energy demand while making up only about 2% of its mass. But if more energy-efficient neurons could somehow be genetically engineered, larger brains might become workable without the metabolic cost that vertebrates have faced so far.
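
To put that metabolic cost in concrete terms, here is a back-of-the-envelope calculation; the resting metabolic rate figure is an assumed typical value for illustration, not from any study cited here.

```python
# Back-of-the-envelope brain power budget (all figures are assumed
# typical values for illustration).
bmr_kcal_per_day = 1600               # resting metabolic rate of an adult
brain_share = 0.20                    # fraction of resting energy the brain uses
kcal_per_day_to_watts = 4184 / 86400  # 1 kcal/day expressed in watts

brain_kcal_per_day = bmr_kcal_per_day * brain_share
brain_watts = brain_kcal_per_day * kcal_per_day_to_watts

print(round(brain_kcal_per_day))      # 320 kcal/day
print(round(brain_watts, 1))          # 15.5 W, about a dim light bulb
```

On those rough numbers, doubling brain size without more efficient neurons would add roughly another 300 kcal per day to the body's energy budget, a serious cost in any food-limited environment.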

At some point, we might want to genetically engineer humans to have natural immunity to lots of diseases, but in the meantime, improved vaccines might make that a low priority.

The interactions of technology and selective pressures make what the future of humanity will look like difficult to predict. A hundred thousand years out, who knows?

Thursday, December 15, 2016

Bacterial Control of Gene Expression And Horizontal Gene Transfer In Pillbugs

Pillbug. Roly-poly. Woodlouse. Doodle bug. This endearing creature, which goes by many names, is common throughout North America and Europe. It seeks moisture, scurries from light, and rolls into a ball when threatened. And it often finds itself embroiled in an evolutionary war with a bacterium. At stake is its very sexual identity.

In pillbugs, sex is determined by two chromosomes: Z and W. Individuals who inherit two Zs develop as males, while ZW individuals become female. But in some populations, these rules are overwritten by a microbe called Wolbachia.

Wolbachia infects the cells of pillbugs and only passes down the female line; only mothers can transmit the bacterium to their young. Male embryos are dead ends to Wolbachia, so when it runs into them, it feminizes them by interfering with the development of hormone-producing glands. The result is that all young pillbugs infected with Wolbachia grow up into females, even those that are genetically male. In such populations, the W chromosome tends to disappear altogether. Eventually, all the pillbugs are ZZ, and it’s the presence or absence of Wolbachia that dictates whether they become female or male.
It’s astonishing enough that a microbe should so totally take the reins of sex determination from its host. But this story, which a group of French scientists have pieced together over the last 40 years, now has an even more baffling twist. 
In the 1980s, the French researchers showed that some pillbugs do not have Wolbachia, but act as if they did. They’re all ZZ, but some still develop as females. The researchers proposed that the bacterium has transferred a piece of its DNA into the pillbug’s genome, and that this “feminizing element”—or f-element—was now dictating the animal’s sexes, even in the microbe’s absence. 
With the technology of the 1980s, the researchers had no way of testing their idea. But 30 years on, Richard Cordaux, from the University of Poitiers, has come very close to proving that they were right.
From The Atlantic.

What if this happened to humans instead of pillbugs?

Horizontal gene transfer and long term symbiotic parasitic infections with subtle behavioral effects are not unknown in humans. Maybe it already happens. We are pretty confident that gender variants like homosexuality and transgender identity are biological and usually arise early in life. But our understanding of why this happens is murky at best. We know that it isn't simply a matter of having or not having a gene, but a variety of theories are competing to explain why it does happen.

The only fictional portrayal of a fairly similar phenomenon is Greg Bear's 1999 novel Darwin's Radio. But it would surely be a stunning and confusing phenomenon to live through, and might take decades to understand, just as it did in the case of the pillbugs.

Yet, there is really no good reason why this couldn't happen to humans just as it did to the pillbugs, some day in the future. Humans need some kinds of bacteria to live, so they can't simply destroy all bacteria in their environments. Too strong an antibiotic in your system can cause as much harm as a mildly harmful bacterium. So, the possibility of this kind of horizontal gene transfer or parasitic control of gene expression is always open.

Friday, October 28, 2016

Extinct Tasmanian Tiger's Brain Analyzed

This is like something out of a Steampunk novel or Jurassic Park. Cool!
The last known Tasmanian tiger (Thylacinus cynocephalus) - aka the thylacine - died in 1936. Because its natural behavior was never documented, we are left to infer aspects of its behavior from museum specimens and unreliable historical recollections. Recent advances in brain imaging have made it possible to scan postmortem specimens of a wide range of animals, even more than a decade old. Any thylacine brain, however, would be more than 100 years old. 
Here, we show that it is possible to reconstruct white matter tracts in two thylacine brains. For functional interpretation, we compare to the white matter reconstructions of the brains of two Tasmanian devils (Sarcophilus harrisii). We reconstructed the cortical projection zones of the basal ganglia and major thalamic nuclei. The basal ganglia reconstruction showed a more modularized pattern in the cortex of the thylacine, while the devil cortex was dominated by the putamen. Similarly, the thalamic projections had a more orderly topography in the thylacine than the devil. These results are consistent with theories of brain evolution suggesting that larger brains are more modularized. Functionally, the thylacine's brain may have had relatively more cortex devoted to planning and decision-making, which would be consistent with a predatory ecological niche versus the scavenging niche of the devil.
Gregory S. Berns, Ken W.S. Ashwell, "Reconstruction of the cortical maps of the Tasmanian tiger and comparison to the Tasmanian devil" bioRxiv (October 26, 2016) https://doi.org/10.1101/083592

Monday, October 6, 2014

Personalities Aren't Just For Humans

Recent studies show that animals far removed from mammals have recognizable personalities.

There are extroverted and introverted sharks in the same species.  The extroverts scare away threats by hanging out in groups.  The introverts head off alone to isolated places and rely on camouflage.

There are also aggressive and docile female communal spiders, common in the American Southeast.  Different mixes of personality are favored in different locations, and the proportion of each personality type, which is hereditary, provides the first solid evidence of group selection.  The personality proportions remain the same even in places where local spiders of the same species have a radically different mix of aggressive and docile females that would be more adaptive.

Anthropologists have demonstrated fairly convincingly that coherent ethnic, regional and national cultures differ in what would normally be considered personality traits, such as the differences between cultural norms in Northern China, traditionally a wheat and millet farming area, and Southern China, traditionally a rice farming area.  One of the ongoing debates in cultural history, anthropology and genetics is the extent to which a nation's "national character" is purely a product of cultural transmission, or instead involves (at least in part) differences in population genetic makeup, with group selection favoring different mixes of personalities in different environments.  Those mixes could continue to manifest even as migration and cultural change render a "nation's" ancestral mix of personalities, produced by balancing selection, dysfunctional in a new environment.  (The evidence concerning the wheat v. rice farming dichotomy in China tends to favor a cultural rather than a genetic source, by the way, which is the leading view, for the most part.)

While some seemingly complex phenotypes really do have complex genotypes as their source, other seemingly complex phenotypes can arise from just a single genetic locus (in the reference, butterfly wing patterns).  There are also a variety of candidate simple genes with apparent impacts on personality.  Scientists have similarly identified a very simple single locus that can be used to make fruit flies homosexual or heterosexual.

The relatively discrete personality categories observed in some species, the patterns of balancing selection among personality types, and the fact that, for example, the personality distribution of humans with high IQs is quite similar to the personality distribution of other humans, all suggest that, unlike massively polygenic traits such as IQ or stature, which have a quite continuous range of values and a strong bias towards the fitness-enhancing direction, personality differences may be accounted for by a fairly modest number of common genetic variants.

But, large scale comprehensive searches for personality genes in humans have mostly come up empty.  Yet, evidence from sources like twin studies strongly suggests that personality has a substantial hereditary component and that genotype plays a major role in shaping a person's personality.  As recent studies of schizophrenia reveal, one problem with these searches may be insufficiently precise definitions of personality types to capture the link between genotype and phenotype.

* * * *

Not quite on topic, but also fascinating, is a newly discovered species of parasitic ant that evolved from the very ant species that it parasitizes (something confirmed with a genetic analysis), within the very same ant colony.  This is essentially the insect equivalent of vampires evolving as a new parasitic species of humans within a single human community.

Tuesday, April 2, 2013

Back To Ancient History

The burst of early spring physics conferences seems to be past, and so it is time to think about the deep past and population genetics again.

* John Hawks has flagged a number of interesting articles.  One study, linking population size and technological complexity in Oceania at first contact with European sailors, reaches a fascinating conclusion:
Much human adaptation depends on the gradual accumulation of culturally transmitted knowledge and technology. Recent models of this process predict that large, well-connected populations will have more diverse and complex tool kits than small, isolated populations. While several examples of the loss of technology in small populations are consistent with this prediction, it found no support in two systematic quantitative tests. Both studies were based on data from continental populations in which contact rates were not available, and therefore these studies do not provide a test of the models. Here, we show that in Oceania, around the time of early European contact, islands with small populations had less complicated marine foraging technology. This finding suggests that explanations of existing cultural variation based on optimality models alone are incomplete because demography plays an important role in generating cumulative cultural adaptation. It also indicates that hominin populations with similar cognitive abilities may leave very different archaeological records, a conclusion that has important implications for our understanding of the origin of anatomically modern humans and their evolved psychology.
 
* On the methodology front, someone has found a way to turn W.E.I.R.D. samples into a feature rather than a bug when doing genomics:
Although much is known about college students as a special sample in terms of their behavioral traits such as intelligence and academic motivation, no studies have examined whether college students represent a “biased” sample in terms of their genotype frequencies. The present study investigated this issue by examining the Hardy–Weinberg equilibrium of genotype frequencies of 284 SNPs covering major neurotransmitter genes in a sample of 478 Chinese college students, comparing these frequencies with those of a community sample (the 1000 Genomes dataset), and examining behavioral correlates of the SNPs in Hardy–Weinberg disequilibrium. Results showed that 24 loci showed Hardy–Weinberg disequilibrium among college students, but only two of these were in disequilibrium in the 1000 Genomes sample. These loci were found to be associated with mathematical abilities, executive functions, motivation, and adjustment-related behaviors such as alcohol use and emotion recognition. Generally, genotypes overrepresented in the college sample showed better performance and adjustment than under-represented or non-biased genotypes. This study illustrates a new approach to studying genetic correlates of traits associated with a socially-selected group—college students—and presents the first evidence of genetic stratification in terms of education attainment.
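
For readers unfamiliar with the method the abstract describes, testing a single biallelic SNP for Hardy–Weinberg equilibrium amounts to a chi-square comparison of observed genotype counts against the counts expected from the allele frequencies. A minimal sketch in Python follows; the function name and the genotype counts are illustrative, not from the study.

```python
def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Chi-square statistic for Hardy-Weinberg equilibrium at one biallelic locus."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)  # frequency of allele A
    q = 1 - p                        # frequency of allele a
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_AA, n_Aa, n_aa)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical genotype counts for one SNP in a sample of 478 students:
stat = hwe_chi_square(200, 220, 58)
print(round(stat, 2))  # 0.04, well below the 3.84 critical value
```

With 1 degree of freedom, values above roughly 3.84 would indicate disequilibrium at p = 0.05; the study's interesting loci are the ones where the college sample fails this test while the community sample passes it.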
 
* On the "to do" list, one of the projects at the top of the list is to look into the circumstances that lead to language formation, which may or may not be distinct from "ordinary" language evolution.  A number of examples and research leads come to mind to get at it, but full fledged language formation is very rare and mostly prehistoric, with the exception of certain creoles and a couple of other outliers.

There are creoles whose formation process is well documented, and, short of creole formation, punctuated language evolution via intense language contact.  There are instances of isolated communities of deaf people developing their own sign languages from scratch.  There is a growing literature discussing how societies and subcultures that split off (e.g. the differentiation of the Romance languages, revolutionary Americans, gang members, mother-in-law languages and "women's languages", Urdu v. Hindi) deliberately differentiate themselves linguistically as a means of distinguishing between insiders and outsiders and of developing cultural identity.  There are lots of data points on what drives language shift, which language formation requires, but language formation also requires more.

There are purposefully constructed languages and linguistic constructs (e.g. pig latin), although very few of them seem to catch on.  They are like third parties in a first-past-the-post, single-member-district electoral system biased towards two parties.  The viable ones form out of schisms in existing languages or are driven by nationalism (e.g. reconstructed modern Hebrew), not by the urge to advance intellectually compelling ideas.

I think that the extent to which language formation and change is punctuated rather than gradual is greatly underestimated.  But, I'm curious in particular about how tightly integrated features of a language, like phonetics and grammar, change relative to more peripheral features, like the non-core lexicon.  For example, feminism has recently made some pretty significant changes to features of these kinds in English.

In particular, I'm curious about what circumstances might lead to the formation of a new viable language in the modern era.

Tuesday, May 8, 2012

Parallel Human Evolution

More than one independently evolved mutation can have the same phenotypic consequence. For example, there are at least three separate mutations that confer the ability to drink cow's milk as an adult without ill effect (called lactase persistence), with frequencies that vary from region to region in the world.

More recently, a study quite clearly summarized at Maju's blog has shown that there is more than one mutation that can produce blonde hair.

[T]he gene causing blond hair among Melanesians (and some relatives like Fijians) is not the same as those involved in blond hair in Europe. Mind you that it is not clear yet which are these European genes of blondism but it is clear that the Melanesian allele is not it[.]
In the case of lactase persistence, there is a clear selective advantage to having the mutation, so it isn't surprising that multiple independent mutations conferring the phenotype would emerge.

Another, less obvious example of what appear to be multiple, independent genetic causes for the same phenotype, apparently driven by some sort of selective advantage, is the set of mutations that lead to short stature in African Pygmies and in the Asian populations sometimes called Negritos. Rather than having deep origins that pre-date the Out of Africa era in the shared genetic histories of these populations, short stature appears to be a relatively recent (i.e. Holocene era) adaptation to life in a marginalized hunter-gatherer population.

In the fairly simple examples mentioned above, only a handful of genes are at work. But, in the case of broad phenotypes like mental retardation, autism, or genetic predispositions to schizophrenia or bipolar disorder, the number of separate mutations expressed in ways that are currently indistinguishable in clinical diagnosis is probably in the hundreds or more for each condition, with no single mutation playing a dominant role in any of these conditions at the level of specificity with which they are diagnosed today.

The Limits Of Current Genome Databases

Also instructive is the fact that no gene causing blond hair in Melanesians was present in any of the individuals in the 1000 Genomes database. The database can be very helpful in catching the lion's share of genetic variation for traits found at more than trivial frequencies in large global populations, but rare traits in small populations can still easily be absent from the current set of published human genomes, even though each individual's genome provides hundreds of thousands of datapoints.
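
The arithmetic behind that long tail is simple. Assuming random sampling, the chance that a variant with allele frequency f is entirely absent from a sample of 2N chromosomes is (1 - f) raised to the 2N power. A quick illustration, with made-up frequencies:

```python
# Probability that an allele never appears in a randomly drawn sample
# (illustrative frequencies, assuming simple binomial sampling).
def prob_absent(allele_freq, n_individuals):
    chromosomes = 2 * n_individuals  # diploid: two copies per person
    return (1 - allele_freq) ** chromosomes

# A globally rare variant (0.1% frequency) across 1,000 sampled genomes:
print(round(prob_absent(0.001, 1000), 3))  # 0.135: missed about 1 time in 7
# A 10%-frequency variant in a population contributing only 2 individuals:
print(round(prob_absent(0.10, 2), 3))      # 0.656: missed about two-thirds of the time
```

So a variant at appreciable frequency in a small, under-sampled population can easily fail to show up at all, which is consistent with what happened with the Melanesian blond-hair allele.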

Rich data on each individual in the sample may be useful for efforts to locate the ancestral origins of individuals in relation to each other with considerable precision despite having only a thousand or two individuals (and much smaller subsets for given geographic regions).  But, they don't capture the long tail of genetic diversity in modern humans.

Thursday, January 5, 2012

New Finding Hints At Common Mechanism in Alzheimer's And Autism

The amyloid precursor protein is typically the focus of research related to Alzheimer's disease. However, recent scientific reports have identified elevated levels of a particular protein fragment, called sAPP-α, in the blood of autistic children. The fragment is a well-known growth factor for nerves, and studies imply that it plays a role in T-cell immune responses as well.

From here.

Abnormal immune function has been noted in children with autism before, but no cause had previously been identified. The new study suggests that this protein that is already a biomarker for Alzheimer's disease may also be a biomarker for autism. Alzheimer's disease is the most well known form of geriatric dementia, although pre-clinical signs of it may start to manifest as early as young adulthood.

Autism is typically first diagnosed in preschool children. Narrow definition autism, a characteristic subtype of developmental disability found in 1 in 110 children that disproportionately affects boys, is part of an "autism spectrum" found in more children; at its milder end, the spectrum is sometimes described as a form of mere neurodiversity.

The research suggests that it may be possible in a few years to do a blood test for autism, allowing for earlier diagnosis, which could be helpful if earlier treatments have a better chance of being effective, and could also reduce the risk of misdiagnosis leading to inappropriate treatment.