
A Tale of Two APOE Alleles

Summary: For people born in the US around 1900, the genetic variant APOE ε4 was associated with longer lifespans. More recently, it has become associated with shorter lifespans and a higher risk of Alzheimer’s disease (AD). This may be because the environment has changed: the burden of certain infectious diseases, such as diarrheal disease, has decreased substantially. If true, this may help us figure out how APOE ε4 contributes to the risk of AD.


A new study by Wright et al uses data from AncestryDNA to investigate the genetic basis of human lifespan. The majority of the individuals in this study (80%) were from the US.

They found only one gene, APOE, with SNPs that had significant associations with both age and lifespan. The APOE SNP they found that was associated with age was rs429358, which changes the amino acid at position 130 of the APOE protein and distinguishes APOE ε4 from APOE ε3/ε2. The APOE SNP they found to be most associated with lifespan, rs769449, is also highly correlated with APOE ε4.

APOE ε4 is, of course, the genetic variant that mediates the majority of genetic risk for AD.

What is particularly interesting about Wright et al’s data is that APOE has a differential effect on longevity based on birth cohort:

[Figure: Fig 5E from Wright et al 2019]

As the authors write: “APOE exhibited a negative effect on lifespan in older cohorts and a positive effect in younger cohorts… The minor allele at APOE [read: ε4] was at highest frequency for intermediate lifespan values (74-86 years). This pattern was most pronounced in the younger birth cohorts, and it suggested that this allele [ε4] (or a linked allele or alleles) confers a survival benefit early in life but a survival detriment later in life.”

The authors don’t speculate much about why APOE ε4 has this differential effect on longevity, but I get to speculate: that’s why I have a blog. Here’s my explanation, which borrows heavily from previous conversations I’ve had with the brilliant Dado Marcora.

In 2011, Oriá et al published an intriguing study looking at the relationship between APOE ε4 polymorphisms and diarrheal outcomes in children living in a Brazilian shanty town. They found that APOE ε4 was associated with the least diarrhea:

[Figure from Oriá et al 2011]

The CDC has this amazing list of the most common causes of death in the US from 1900 to 1998. One of the things that’s striking about this data is how much more common diarrhea used to be in the US as a cause of death. In 1900, diarrhea, enteritis, and ulceration of the intestines was the third leading cause of death:

[Figure: CDC table of leading causes of death, 1900]

But it starts dropping steadily, and by 1931 it’s the 10th leading cause of death:

[Figure: CDC table of leading causes of death, 1931]

After that, it no longer appears in the top 10. My guess is that this is probably mostly due to cleaner water. According to the CDC: “In 1908, Jersey City, New Jersey was the first city in the United States to begin routine disinfection of community drinking water. Over the next decade, thousands of cities and towns across the United States followed suit in routinely disinfecting their drinking water, contributing to a dramatic decrease in disease across the country.”

Let’s assume that what I’m implying is true: that APOE ε4 used to help people in the US live longer by protecting them from diarrheal illnesses that stunt development. If so, it stands to reason that the relative protection APOE ε3/ε2 confers against AD might also act by modulating development.

There is some data to support this. For example, Dean et al 2014 found that “infant ε4 carriers had lower MWF and GMV measurements than noncarriers in precuneus, posterior/middle cingulate, lateral temporal, and medial occipitotemporal regions, areas preferentially affected by AD.” It may be wise to take the developmental roots of AD more seriously.


One field where the methods used to study postmortem human brain tissue have recently been especially relevant is adult neurogenesis.

In 2018, Sorrells et al made a splash when they used 37 donated postmortem brain samples and 22 neurosurgical specimens from people with epilepsy to suggest that neurogenesis only occurs at negligible levels during adulthood. This data seemed to contradict results from rodents.

[Figure: DCX staining in rats; Oomen et al 2009, doi:10.1371/journal.pone.0003675]

I recently came across Lucassen et al 2018, which critiques Sorrells et al 2018 on a few methodological grounds:

  1. Postmortem interval: Very little clinical data was made available for each brain donor in Sorrells et al, and the postmortem interval (PMI) was one of the omitted variables. The neurogenesis marker DCX appears to break down or otherwise become negative on staining shortly after death, so extended PMIs could cause false negatives for DCX staining. Lucassen et al also noted that there might be differential effects of PMI in old and young human brains, for example as a result of differences in myelination.
  2. Cause of death: Lucassen et al noted that certain causes of death, such as sepsis, might be more likely to cause a breakdown of protein post-translational modifications. In the case of the other neurogenesis marker studied, PSA-NCAM, its poly-sialic group might have been lost in hypoxic brains that have substantial perimortem lactic acid production and resulting acidity.
  3. Need for 3d data: Lucassen et al note that the individual EM images presented by Sorrells et al are difficult to interpret because brain cells have complicated, branching morphologies. Instead, they suggest that 3d reconstructions of serial EM images would be more dispositive. Creating such reconstructions is often harder in postmortem human brain tissue than in rodent brain tissue when the cell processes span a volume too large to be effectively preserved by immersion fixation and perfusion fixation is not possible.

I don’t know enough about human neurogenesis, DCX, PSA-NCAM, or the other areas discussed to know if Lucassen’s critiques mean that Sorrells et al’s data truly won’t replicate. But I found the methodological critiques to be valid and important.

Immersion fixation of a human brain is fairly slow. One study found that it took an average of 32 days for single brain hemispheres immersion fixed in 10% formalin to be fully fixed (as proxied by achieving the minimum T2 value).

This means that fixative won’t reach the tissue in the innermost regions of the brain until a substantial amount of tissue disintegration has already occurred.
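
To get intuition for why this takes weeks, fixative penetration into tissue is often modeled as growing with the square root of time, so the time needed scales with the square of the depth to the innermost tissue. Here is a back-of-the-envelope sketch; the model (depth ≈ K√t) is a standard simplification, and the value of K below is an arbitrary placeholder rather than a measured coefficient for formalin.

```python
# Back-of-the-envelope: if fixative penetration follows depth ~ K * sqrt(time),
# then the time to fix tissue at a given depth scales with depth squared.
# K is an assumed placeholder value, chosen only to illustrate the scaling.
K = 1.5  # assumed penetration coefficient, in mm per sqrt(hour)

for depth_mm in [5, 10, 20, 40]:
    hours = (depth_mm / K) ** 2
    print(f"depth {depth_mm:>2} mm -> ~{hours:6.0f} hours (~{hours / 24:.1f} days)")
```

Whatever the exact coefficient, the quadratic scaling is the point: doubling the depth to the innermost tissue roughly quadruples the fixation time.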

Here are a few approaches to speed up immersion fixation in brain banking protocols. For each approach, I’m also going to list a rough, completely arbitrary estimated probability that it would actually speed up the fixation process, as well as some potential downsides.

1) Cutting the brain into smaller tissue sections prior to immersion fixation. This is the most common approach already used to speed up immersion fixation. It relies on the obvious idea that if you directly expose more of the tissue to the fixative, the process of fixation will finish faster. I list it here for completeness.

Probability of speeding up immersion fixation: Already demonstrated.

Downsides: Damage at the cut interfaces, difficulty in inferring how cellular processes correspond between segments, mechanical deformation, technical difficulty in cutting fresh brain tissue in a precise manner.

2) Using higher concentrations of fixative. This makes biophysical sense according to Fick’s law of diffusion, as a higher concentration gradient of fixative should increase its rate of diffusion into the tissue. One study found that 10% formalin led to a faster fixation rate in pig hearts, at least at the longest time interval studied (168 hours):

[Figure: fixation rates in pig hearts; Holda et al 2017, PMID 29337978; FBPS = formaldehyde phosphate-buffered solution]

If 10% is faster than 2% or 4%, then 100% formalin would likely be faster than 10%.
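
For reference, Fick’s first law in one dimension is

\[ J = -D \, \frac{\partial C}{\partial x} \]

where J is the diffusive flux of fixative into the tissue, D is the fixative’s diffusion coefficient, and the derivative is the concentration gradient across the tissue surface. Since fresh tissue contains essentially no fixative, a more concentrated bath means a steeper initial gradient and therefore a larger initial flux. This is just the textbook relation, not anything specific to fixation, and it ignores how the fixative reacts with the tissue as it moves inward.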

Probability of speeding up immersion fixation with 50-100% compared to 10% formalin: 95%

Downsides: 100% formalin could produce more toxic fumes, it is likely more expensive, and it is not as easily accessible. It could also lead to more overfixation (e.g. antigen masking) of outer surface regions, although it theoretically could reach parity on this measure if a shorter amount of time were used for the fixation.

3) Using the cerebral ventricles as an additional source of fixative immersion.


If you can access the ventricles of the brain with a catheter or some other device, you could allow fixative to diffuse into the ventricles. This would allow for a substantially increased surface area from which fixatives can diffuse.

Because the cerebral ventricles are already there, using them allows for some of the advantages of the dissection approach without having to cut the brain tissue (other than the tissue damaged when placing the catheter(s) into the ventricles).

This approach can also be used when the brain is still inside of the skull, via the use of cranial shunts.

Access to the lateral ventricle is likely part of why immersion fixation is much faster after hemisecting the brain, which is already commonly done in brain banking protocols. 

Probability of speeding up immersion fixation: 50%. There are plenty of unknowns here. For example, are the ventricles already accessed through the cerebral aqueduct or canal when the brain is removed through the skull in standard immersion fixation? Do the ventricles collapse ex vivo or when the brain is taken out of the skull, rendering the approach much less effective? The uncertainty here should be attributed to my own ignorance of this literature, as other people are likely aware of the answers.

Downsides: Damage to parenchyma where the catheters are inserted, increased complexity of the procedure.

4) Using glyoxal or another fixative as a complementary agent. This is a pie-in-the-sky idea, but what about using a fixative other than formaldehyde? Glyoxal is one possibility. It has potential as an alternative fixative in terms of morphology preservation, and while it doesn’t seem to be quite as efficient a crosslinker as glutaraldehyde, it might diffuse faster because it is smaller. I haven’t been able to find good diffusion time measurements for glyoxal after a brief search. Glyoxal is also likely less toxic than formaldehyde.


Why use it at all if it likely diffuses slower than formaldehyde? It’s not only about how quickly a fixative agent reaches the target tissue, but also about how efficiently it crosslinks once it gets there, which is what actually stops disintegration and stabilizes the tissue. Glyoxal is the smallest dialdehyde, so it might be a bit of a Goldilocks in the crosslinking efficiency vs diffusion speed trade-off. But, again, this is pie-in-the-sky and would need actual testing.
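
To put rough numbers on the size argument: the molecular weights below are real, but the Stokes-Einstein-style scaling (D roughly proportional to M^(-1/3)) and the resulting ratios are only a crude way to build intuition, not measured diffusion coefficients.

```python
# Crude size-based intuition: for small molecules in solution, the diffusion
# coefficient scales very roughly as D ~ M^(-1/3) (Stokes-Einstein, with
# molecular radius ~ M^(1/3) at similar density). The exponent is a rough
# assumption; the molecular weights are real.
mw = {"formaldehyde": 30.0, "glyoxal": 58.0, "glutaraldehyde": 100.1}

for name, m in mw.items():
    d_rel = (mw["formaldehyde"] / m) ** (1 / 3)
    print(f"{name:>14}: estimated D ~ {d_rel:.2f}x formaldehyde's")
# formaldehyde ~1.00x, glyoxal ~0.80x, glutaraldehyde ~0.67x
```

By this crude estimate, glyoxal would diffuse somewhat slower than formaldehyde but faster than glutaraldehyde, which is consistent with the trade-off described above.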

Probability of speeding up immersion fixation: 10% with glyoxal, 90% with some other fixative or combination of fixatives. It seems unlikely — but possible — that the first fixative ever used would just happen to be the best at immersion fixation of large tissue blocks.

Downsides: Other fixatives will likely be more expensive, less accessible, and cause artifacts that are harder to adjust for than the well-known ones caused by formaldehyde.

5) Ultrasound-enhanced delivery. Ultrasound has been shown to increase the speed of fixation in tissue blocks. One study found that ultrasound increased delivery speed of non-fixative chemicals (at the end of a catheter) by 2-3x. The mechanism is unknown, but could involve heat, which is already known to increase diffusion speed (not ideal, as this would also likely increase tissue degradation), and/or acoustic cavitation, a concept that I don’t fully understand, but which can apparently speed liquid diffusion directly.

Probability of speeding up immersion fixation: 50%. I’d like to see these studies done on more brain tissue and for them to be replicated. However, they are pretty promising.

Downsides: Ultrasound might itself damage cellular morphology and/or biomolecules. However, considering that ultrasound has also been used in vivo, eg for opening the blood-brain barrier, it is unlikely to cause too much damage to tissue ex vivo, at least when using the right parameter settings.

6) Convection-enhanced delivery. This technique, which has primarily been used in neurosurgery, involves inserting catheters into brain parenchyma in order to help distribute chemicals such as chemotherapeutic agents. There’s no reason why this couldn’t be leveraged for brain banking as well.

Certain areas of the brain, perhaps the innermost ones that would otherwise take forever to be fixed, could be chosen to have small catheters inserted, allowing local delivery of fixative.

This would allow for an increase in the “effective surface area” of the fixative while minimizing damage due to sectioning and allowing the brain to remain intact.

Probability of speeding up immersion fixation: 99%. It’s hard to see how using convection-enhanced delivery of fixatives with catheters inside of the brain parenchyma wouldn’t speed up immersion fixation, but since I’m not aware of studies on it, there may be some technical difficulties that I’m not recognizing.

Downsides: Damage to the brain tissue from inserting the catheters, potential build-up of fluid pockets of fixative near the catheter tip that could damage nearby tissue if the infusion rate is too high, increased complexity, cost, and time for the procedure.

7) Shaking or stirring the fixative continuously (added 8/18/2019). This should increase the speed of fixation in a way loosely analogous to convection-enhanced delivery: convection keeps fresh fixative moving past the tissue, but at the surface rather than inside of it, so the layer of fluid next to the brain doesn’t become locally depleted of fixative.

The optimal rate of shaking or stirring is TBD and will depend on various factors specific to the experiment. Among other factors, there is likely a trade-off between such light shaking that it doesn’t have an effect and such vigorous shaking that it will damage the brain tissue due to the translational motion, similar to a concussion.

Probability of speeding up immersion fixation: 99%. This approach makes perfect biophysical sense and it has already been shown to significantly increase fixation speed in freeze substitution. So it should very likely speed up the process of suprazero temperature fixation as well.

Downsides: Concussion-like damage to the brain, increased complexity, possible increased displacement of solutes within the brain.

A new-to-me concept is the idea of an Everest regression — “controlling for altitude, Everest is room temperature” — wherein you use a regression model to remove a critical property of an entity, and then go on to make inappropriate/confusing/misleading inferences about that entity.

My immediate thought is that this is an excellent analogy for one of my concerns regarding regressing out the effect of age in studies of Alzheimer’s disease (AD). It’s such a tricky topic.

On the one hand, not everyone who reaches advanced age develops the amyloid beta plaques and other features that define the cluster of AD pathology. Meanwhile, there are potentially other changes in brain biology that you will see in advanced aging but not in AD, such as loss of dendritic spines, epigenetic changes, and accumulation of senescent cells.

On the other hand, advanced age is the most important risk factor for AD and explains most of the variance in disease status on a population basis. Arguably, a key part of why some “oldest old” folks do not have AD is protective factors. There have also been suggestions that accelerated aging is part of AD pathophysiology, although, as far as I can tell, the evidence for this remains preliminary. From this perspective, advanced age in AD is like the high altitude of Everest — it’s one of the key associated features.

So if you are trying to find the effects of AD pathophysiology, for example in a study of postmortem human brain samples, should you adjust for the effect of age or not? This is a practical and tricky question without a clear answer. It probably depends on your underlying model of how AD develops in the first place.

So I think it’s worthwhile to be cognizant of the potential hazards of adjusting for age — namely, that you risk inadvertently performing an Everest regression and removing an important chunk of the pathophysiology that you actually want to understand.
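
To make the hazard concrete, here is a toy simulation (all numbers are arbitrary assumptions, not real AD data): a hypothetical brain marker depends only on disease status, disease risk rises steeply with age, and residualizing the marker on age before comparing groups attenuates the apparent disease effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Toy model: age drives AD risk, and AD drives a hypothetical brain "marker".
age = rng.normal(75, 8, n)
p_ad = 1 / (1 + np.exp(-(age - 80) / 4))   # AD much more likely at older ages
ad = rng.binomial(1, p_ad)
marker = 2.0 * ad + rng.normal(0, 1, n)    # marker depends on AD only (true effect = 2.0)

# Naive group difference recovers roughly the true effect of 2.0
naive = marker[ad == 1].mean() - marker[ad == 0].mean()

# "Everest regression": residualize the marker on age first, then compare groups
coeffs = np.polyfit(age, marker, 1)
resid = marker - np.polyval(coeffs, age)
adjusted = resid[ad == 1].mean() - resid[ad == 0].mean()

print(f"AD effect, unadjusted:       {naive:.2f}")
print(f"AD effect, age-residualized: {adjusted:.2f}  # attenuated")
```

The attenuation happens because age and AD status are so strongly correlated that removing the age-associated variance also removes much of the AD-associated variance.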

One of the most remarkable findings in aging over the past decade is that it’s possible to track the rate of aging based on stereotyped DNA methylation changes across a diverse set of tissues. These are known as epigenetic clocks.
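
For the unfamiliar, the general recipe behind these clocks is a penalized (elastic net) regression of chronological age on CpG methylation values. Here is a minimal sketch of that idea on synthetic data; the sample size, number of CpGs, and simulated age signal are all made up, and the fit is evaluated in-sample purely for illustration.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(0)
n_samples, n_cpgs = 300, 2000

# Synthetic data: a handful of CpGs drift with age, the rest are noise
age = rng.uniform(20, 90, n_samples)
meth = rng.uniform(0, 1, (n_samples, n_cpgs))
meth[:, :50] += 0.005 * (age[:, None] - 55)
meth = np.clip(meth, 0, 1)

# "Clock" = elastic net regression of age on methylation values
clock = ElasticNetCV(cv=5).fit(meth, age)
pred = clock.predict(meth)

print(f"CpGs with nonzero weight:       {(clock.coef_ != 0).sum()}")
print(f"in-sample correlation with age: {np.corrcoef(pred, age)[0, 1]:.2f}")
```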

But as anyone in the gene expression field knows, changes in the levels of epigenetic markers between groups (like young vs older) are confounded by cell type proportion differences between those groups.

This cell type proportion confound makes it harder to tell whether the changes in DNA methylation are truly a marker of aging or whether they are due to cell type proportion variations that are already known to occur during aging, like naive T cell depletion due to thymus atrophy.
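
Here is a minimal sketch of that confound with made-up numbers: even if no cell type changes its own methylation at a given CpG, a shift in cell type proportions changes the bulk measurement.

```python
import numpy as np

# Hypothetical per-cell-type methylation at a single CpG (values are made up)
meth_by_celltype = np.array([0.80, 0.20, 0.50])   # e.g. naive T, memory T, myeloid

# Hypothetical cell type proportions in young vs old blood samples
props_young = np.array([0.40, 0.30, 0.30])
props_old   = np.array([0.10, 0.55, 0.35])        # e.g. naive T cell depletion with age

# Bulk methylation is just the proportion-weighted average over cell types
print(f"bulk methylation, young: {props_young @ meth_by_celltype:.2f}")
print(f"bulk methylation, old:   {props_old @ meth_by_celltype:.2f}")
```

With these made-up numbers, bulk methylation at this CpG drops from 0.53 to roughly 0.37 between the young and old samples purely because of composition, even though no cell changed its methylation state.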

Single cell epigenetics has the potential to address this problem. By measuring DNA methylation patterns within individual cells, you can compare the epigenetic patterns within the same cell type between groups, and don’t have to worry (as much) about overall changes in cell type proportion [1].

I was interested to see whether anyone has used single cell epigenetic profiling, which has only come out within the past couple of years, to measure whether changes in epigenetic marks can be seen within single cells during aging.

First, let’s back up a second and talk about epigenetics. Two of the major factors that define a cell’s epigenome are its DNA methylation patterns and its histone post-translational modifications.

DNA methylation has been studied a bit in single cells. One study looked at DNA methylation in hepatocytes and didn’t find many differences between old and young cells.

However, as a recent review points out, single cell DNA methylation data are currently limited by the small amount of DNA available within each cell, which makes it hard to compare methylation patterns at the same genomic region across different cells.

On the histone modification front, I found a nice article by Cheung et al 2018, who measured histone post-translational modifications (PTMs) in single cells derived from blood samples. They found that in aging, there was increased variability in histone PTMs, both between individuals and between cells.

So, in summary, here are some future directions for this research field that it would be prudent to keep an eye on:

  1. How much of the change in DNA methylation seen in aging is due to changes in relative cell type proportions, as opposed to changes within single cells? If we assume that age-related changes in DNA methylation will be similar to age-related changes in histone PTMs, then Cheung et al.’s results suggest that the changes in DNA methylation are probably due to true changes within single cells during aging.
  2. Is there a way to slow or reverse age-related changes in DNA methylation or histone PTMs, perhaps targeted to stem cell populations? It’s not clear that this can be done in a practical way, especially if age-related changes are driven primarily by an increase in variability/entropy.
  3. If it is possible to slow or reverse DNA methylation or histone PTMs, would that help to slow aging and thus “square the curve” of age-related disease? Aging might be too multifactorial for a single intervention like this to make a major difference, though.

[1]: I add “as much” here because differential expression analysis in single cell data is far from straightforward, and e.g. has the potential to be biased by subtle differences in the distribution of sub-cell types between groups.

One of the common considerations when prescribing haloperidol is whether it will prolong the QT interval. This is a measure of the heart rhythm on the EKG that correlates with one’s risk for serious arrhythmias such as torsades de pointes.


Earlier this year, van den Boogaard et al published one of the largest RCTs to compare haloperidol against placebo (700+ people in each group).

Their main finding was that prophylactic haloperidol was not helpful for reducing the rate of delirium or improving mortality.

But one of their most interesting results was the safety data. This showed that their dose of haloperidol had no effect on the QT interval and did not increase the rate of extrapyramidal symptoms. Their regimen was haloperidol 2 mg IV every 8 hours, which is equivalent to ~10 mg of oral haloperidol per day.

The maximum QT interval was 465 ms in the 2 mg haloperidol group and 463 ms in the placebo group, a non-significant difference with a 95% CI of -2.0 to 5.0 ms.

Notably, they excluded people with acute neurologic conditions (who may have been more likely to have cardiovascular problems) and people with QTc already > 500 ms, which makes generalization of this finding to those groups a bit tricky.

Since I did the same analysis for antidepressants yesterday, I figured that I would analyze the receptor binding profiles of antipsychotics today. Here is a visualization:

[Figure: receptor binding profiles of antipsychotics]

And here is a dendrogram based on a clustering of those receptor affinities:

[Figure: dendrogram from hierarchical clustering of antipsychotic receptor affinities]

It turns out to be much harder to see these drugs cluster by chemical class the way the antidepressants did, but perhaps you will be able to notice some trends.

Here’s my code to reproduce this.
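
For anyone who wants to try something similar, here is a minimal sketch of the general approach (hierarchical clustering of a drug-by-receptor affinity matrix). This is not the code linked above; the file name "affinities.csv" is a hypothetical placeholder for a table with one row per drug and one column per receptor (e.g. pKi values).

```python
import pandas as pd
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

# Hypothetical input: one row per drug, one column per receptor (e.g. pKi values)
df = pd.read_csv("affinities.csv", index_col=0)

# Pairwise distances between drugs based on their affinity profiles,
# followed by average-linkage hierarchical clustering
distances = pdist(df.values, metric="euclidean")
Z = linkage(distances, method="average")

dendrogram(Z, labels=df.index.tolist(), leaf_rotation=90)
plt.tight_layout()
plt.show()
```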