The Rise and Fall of Morgellons

In 2004, frustrated mother Mary Leitao coined the term “Morgellons Disease.” The name came from a 17th-century source that described feverish French children whose backs would break out in stiff hairs. When the hairs sprouted, their coughs and convulsions would disappear.

Leitao’s story started in 2003, when her 2-year-old son, Drew, developed a sore near his lip. He pointed to it and said “bugs.” When Leitao examined it, she found a fiber inside. Soon there were more sores and more fibers–strange threads of all colors. Seeking answers, she began taking the toddler to doctor after doctor. The last doctor she saw, a specialist in Infectious Disease at Johns Hopkins, concluded that what Drew suffered from was a mother with Munchausen’s by proxy. Needless to say, she was not happy.

In 2004, she created a website detailing her son’s symptoms. By 2008, over 11,000 people had registered on the site to tell their own stories. They, too, suffered from sores and odd fibers. But they added a litany of additional complaints as well: cognitive issues, fatigue, and muscle and joint pain were some common ones. Leitao also started a research foundation devoted to Morgellons. In 2006, it collected almost $30,000 to fund small research projects and promote awareness. Celebrities like Joni Mitchell revealed that they, too, suffered from Morgellons.

People with Morgellons were angry, and the website helped them organize. The typical doctor visit for someone with Morgellons went like this: They would book an appointment, usually with a dermatologist. When the day of the visit arrived, they would bring in a Ziploc bag filled with fibers they had collected from their sores. Often, they would also bring along information they had collected about Morgellons from the web. Presented with these patients, most doctors diagnosed some variant of delusional parasitosis (a disorder in which people believe that bugs are underneath their skin). Treatment for delusional parasitosis typically involves antipsychotics or other psychiatric meds. Not surprisingly, most patients were not happy with the response they were getting from doctors. Sufferers began to demand answers. They started calling the CDC, agitating for an investigation into Morgellons, and enlisted support from politicians like Hillary Clinton, Barack Obama, John McCain, Barbara Boxer, and Tom Harkin. Eventually the CDC bowed to pressure and agreed to research the disorder.

Now all of this may sound crazy. But some responses from bona fide scientists gave credibility to the idea that this disease wasn’t simply about delusions. Randy Wymore, at Oklahoma State University, got interested in Morgellons after reading about it online. People started sending him samples of fiber. He maintained that even though these shipments came from all over the U.S., the blue and red fibers they contained resembled one another. He passed 20 samples on to the forensics team in the Tulsa police department. The forensic lab reported that they were unable to match the chemical structure of the fibers to any of the hundreds in their database, and when they heated the fibers to the highest temperature possible in their lab (700 degrees Fahrenheit), nothing happened. Wymore and his forensic colleagues were baffled. What could these unearthly fibers be? Wymore also referred some Morgellons contacts to a doctor at Oklahoma State named Rhonda Casey. She professed that she found the fibers embedded under the patients’ unbroken skin. Moreover, she felt the people she saw seemed genuinely ill, presenting with a host of neurological symptoms.

None of that stuff got published. But Leitao teamed up with some colleagues who specialized in treating Lyme Disease and wrote a paper on Morgellons that appeared in the American Journal of Clinical Dermatology. They reported that 79 of 80 Morgellons patients were infected with the bacterium responsible for Lyme disease, and hypothesized that Morgellons could be related. (It should be noted that the long-term sequelae of Lyme Disease are controversial in their own right.)

So what did the CDC investigation find? In a 4-year collaboration with Kaiser Permanente that cost $600,000, they identified 115 cases. Their analysis was published in PLoS ONE in 2012. The patients were primarily white, middle-aged females. Half of the hair samples tested came back positive for drugs like amphetamines and cocaine. The women presented with a number of neurological complaints (chronic fatigue, cognitive deficits, etc.). But no parasites or mycobacteria were detected in biopsies of their lesions. And those fibers? They appeared to come from cotton.

If this is true, how can we explain the earlier findings, like the ones from the Tulsa PD? What about those strange fibers that were heated to 700 degrees Fahrenheit and remained unscathed? The ones that didn’t match any known fiber? It seems as though they may have resulted from a strange day in the lab. The results sort of defy belief, and they make you wonder about the forensics team there.

After the CDC study was released, the furor surrounding Morgellons seems to have died down. The website Mary Leitao founded has shut down, and the Morgellons Research Foundation has been shuttered. Apparently Wymore and Leitao fell out, and Wymore started his own foundation at Oklahoma State. His research continues, and people who suffer from Morgellons can still register on his website. The Lyme Disease group also continues to publish papers on Morgellons, linking it to infection with spirochetes.

It’s interesting to look at Morgellons as a powerful example of an internet meme, a disease that exploded after a website was created and then began to wane a few years later when the tide of evidence turned against it. After reading about its history, I just ended up feeling terrible for the people involved, though. You can check out pictures of the lesions. Even if they are self-inflicted or the result of bug bites, they don’t look pleasant. One woman reported day after day of agony as her body released red fibers, culminating in a pink worm coming out of her eye and her coughing up a fly. Another reported waking up in the psych ward multiple times and becoming addicted to cocaine, driven to desperation by her disease. Morgellons patients try crazy and expensive cures–liquid silver, diatomaceous earth, deworming medication meant for farm animals, high-dose antibiotics. These are sad stories. Since we began with Mary Leitao’s tale, you may be wondering how her family is doing. After a while, she reported that her older children began to exhibit signs of Morgellons. Her teenage daughter quit going to school as a result. Since the CDC study, it appears that she has more or less bowed out of the Morgellons community. I hope this family is doing better.

The Arsenic Eaters

Most people do their best to avoid arsenic. If they end up ingesting it, it’s because somebody poisoned them. Or maybe because their well is contaminated with it. But in the 19th century, a group of people who intentionally ate arsenic, known as the toxophagi, became a sort of sensation in the medical community.

In 1854, the Boston Medical and Surgical Journal (the predecessor of the New England Journal of Medicine) translated an account of the arsenic eaters, written by a Mr. de Tschudi. The practice of arsenic eating was apparently introduced in Styria (now a part of Austria) in the 12th century. According to Tschudi, peasants called it hedri and took it for a fresh and healthy appearance. In particular, young people would secretly take arsenic to attract the opposite sex–a lovelorn, gangly teenager could become pleasantly plump, with rosy cheeks, and finally get the attention of the town heartthrob. The drawback, of course, was that sometimes someone would become overly enthusiastic about this beauty aid and accidentally poison themselves.

People also thought that arsenic could help them ascend steep hills by making it easier to breathe. If you have ever seen The Sound of Music, you can imagine this would come in handy in Austria. On a long mountain hike, someone might bring along a little piece of arsenic to chew or mix into their coffee as a pick-me-up.

A newbie would take just a little at a time. They might start with a lentil-sized piece several mornings a week, then work up to as much as eight times that (more than enough to kill most people). Many users reported linking their arsenic intake to lunar cycles, taking less as the moon waned. Once someone was accustomed to this sort of regimen, they reported feeling ill if they tried to go without arsenic. In fact, if the dose wasn’t decreased gradually, the toxophagi said that severe illness or even death could result. What’s more, arsenic eaters could live to a ripe old age, something that could confound the would-be poisoner. Tschudi recounted the story of a servant who disliked his boss. He began poisoning her with arsenic, a little at a time, so that her illness would appear gradual and wouldn’t attract suspicion. Instead of getting sick, he was dismayed to see the woman become increasingly healthy. Apparently, the practice of eating arsenic complicated other cases of poisoning too. When a client was accused of poisoning someone with arsenic, their lawyer would raise the “Styrian defense.” They would maintain that the victim was an arsenic eater, and that they poisoned themselves. In some cases, this defense worked, and the accused was acquitted!

Many U.S. and British scientists were pretty skeptical of these reports of arsenic eating, so they started doing research of their own. At one conference, a doctor actually presented two habitual arsenic eaters, each of whom ate 300-400 mg of arsenic in front of the audience. And in case that didn’t convince the assembled scientists, he also presented a chemical analysis of their urine afterwards, demonstrating that it was full of arsenic. Apparently, scientific conferences back then were more exciting than meetings are today.

All of these glowing reports from Styria ended up changing the perception of arsenic here in the U.S., at least for a while. The stories of the Styrian arsenic eaters helped lend credence to the wild claims made by quacks who peddled arsenic-containing tonics. I found this old advertisement for arsenic “beauty wafers.” Don’t you wonder how many women used them?

The arsenic eaters remind me of Westley, in The Princess Bride, who slowly developed a tolerance to the deadly poison iocane. Only arsenic, unlike iocane powder, is real–and some of these toxophagi really could have pulled off Westley’s put-the-poison-in-both-cups trick!

Does chelation therapy really help heart disease patients?

This March, the results of the Trial to Assess Chelation Therapy, which looked at the effects of chelation therapy on heart disease, came out in JAMA.

What is chelation therapy, you ask? It works like this: a substance that binds heavy metals is given to a patient in order to help him or her excrete these toxic substances. So let’s say that you find your child chewing on lead-containing paint chips. Your first stop should be the hospital, to start chelation therapy. Chelation therapy has also drawn a huge following in the alternative medicine community, though, where it’s thought that many ailments result from heavy metal toxicity. Some parents have been using chelation therapy on their autistic children, in the belief that autism is caused by exposure to mercury via vaccines or other sources. In 2005, a young boy actually died while receiving chelation therapy. Other patients have been using chelation therapy to treat things like heart disease.

The rationale for treating heart disease with chelation therapy is a little confusing. I don’t think anyone believes that a buildup of toxic metals is a primary cause of heart disease. But some proponents appear to believe that EDTA removes calcium from the plaques inside of blood vessels, causing them to shrink, OR that metal ions inside the body produce free radical damage to blood vessels. In the 1990s, trials found no evidence that chelation therapy worked for this purpose. But apparently some alternative health providers have been marketing it as an alternative to invasive surgeries anyway, telling patients it will clear their arteries (I borrowed an example of this type of false advertising from Quackwatch–you can see it below).

TACT has been controversial since the moment it began. This trial cost over 30 million dollars and took over a decade to complete. How did a trial this mammoth get started when the previous literature provided little reason to think that chelation therapy would be successful? Well, a proposal for a chelation therapy trial was submitted to the National Heart, Lung, and Blood Institute in 2000. Not surprisingly, it was rejected. However, the American College for Advancement in Medicine (or ACAM, a pro-chelation organization) and U.S. representative Dan Burton (a proponent of the theory that vaccines cause autism) kept agitating for a chelation trial. And, as it happens, the National Center for Complementary and Alternative Medicine at the NIH issued a very specific call for applications in response. They asked for proposals to investigate the EDTA chelation treatment protocol recommended by ACAM. Not surprisingly, a proposal to do just this was approved, and the TACT study was born.

In TACT, roughly 1,700 patients were treated with 40 three-hour infusions of the chelator disodium EDTA over the course of a year. Not a trivial undertaking for the patients involved! More than half of the sites at which therapy was provided were alternative medicine centers that had been providing chelation for years. Two of them were suspended because of violations. In the middle of the trial, the guy who owned the pharmacy that supplied the EDTA for the trial was indicted for Medicaid fraud. Of the health providers administering the therapy, several were convicted felons, several more had been disciplined by their state medical boards, and still others had been involved in insurance fraud. In addition, one prominent pro-chelation author who admitted that he had falsified data was nonetheless cited a number of times in the TACT protocols. Enrollment for the trial was put on hold in 2008 for an investigation into complaints about things like the centers failing to obtain informed consent properly. And as if all that wasn’t enough, the NIH centers that sponsored the trial weren’t kept blind throughout, as is typical. Instead, they analyzed data periodically throughout the study. So even before the study was complete, it had its fair share of critics.

What did the study’s authors report in the end? Heart attack patients over 50 years of age who got chelation therapy had almost 20% fewer cardiovascular events (mortality, another heart attack, stroke, coronary revascularization, or being hospitalized for angina) than those given a placebo. Now 20% sounds pretty substantial, but the results only just made statistical significance, with a p-value of 0.035. Critics of the study pointed out that, in addition to the concerns listed above about how the study was carried out, it was odd that significantly more patients who received the placebo dropped out of the trial. This is the opposite of what you’d expect–since a treatment is usually associated with more side effects than the placebo, typically patients in the treatment arm drop out at a higher rate. If just a few patients had been treated differently–not hospitalized for angina, not subjected to coronary revascularization–the statistical significance in the study could have disappeared. So if the doctors responsible for these patients were not blinded to their treatment arm, and their knowledge influenced the treatment they provided, this may have affected the study results.
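To get a feel for how fragile a result like this is, here is a minimal sketch in Python using made-up event counts (not the actual TACT data): reclassifying just a handful of borderline endpoint events is enough to push a barely significant comparison back over the 0.05 line.

```python
# A minimal sketch (hypothetical counts, not the actual TACT data) of how
# reclassifying a few endpoint events can erase borderline significance.
from scipy.stats import fisher_exact

def arm_pvalue(events_treatment, events_placebo, n_per_arm=850):
    """Two-sided Fisher exact test comparing event rates in two trial arms."""
    table = [
        [events_treatment, n_per_arm - events_treatment],  # treatment: events / no events
        [events_placebo, n_per_arm - events_placebo],      # placebo:   events / no events
    ]
    _, p = fisher_exact(table)
    return p

# Hypothetical baseline: a ~20% relative reduction that is barely significant.
print(arm_pvalue(220, 260))  # p a bit under 0.05

# Reclassify 10 borderline events (say, hospitalizations for angina)
# from the placebo arm to the treatment arm, and significance vanishes.
print(arm_pvalue(230, 250))  # p well above 0.05
```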

So what should we take away from this 30 million dollar, 10 year study? Even though many cardiologists don’t believe chelation therapy produced a real improvement in heart disease patients, and suspect the positive result was spurious, some seem relieved that, given the popularity of the treatment, at least it doesn’t appear to be dangerous. I guess that is encouraging! Overall, though, this seems like a great example of why scientific proposals should be vetted by scientists, rather than being pushed through NIH by lobbyists and congresspeople. It’s not unusual for big, expensive clinical trials to end up with ambiguous results. But in this case, a ton of money and effort was lavished on a study with a very skimpy rationale and major methodological issues–one that couldn’t have gotten past the peer-review process without substantial help from outside. The recent meddling of congresspeople like Lamar Smith in the grant funding process doesn’t bode well for science.

Of dogs and men

When we cozy up to a dog, or cuddle a cat, or watch a docile cow grazing in a field, it’s easy to forget that the wild ancestors of these animals were not nearly so friendly. Pets and farm animals are so familiar that it’s hard to imagine life without them, but the process of domestication didn’t start until relatively recently in human history. One of the neat things about genetic studies of modern animals is that we are sometimes able to reconstruct their past. In particular, there has been a lot of excitement about potentially revealing the genetic basis for animal domestication. Can a couple of mutations turn a wolf into an affectionate dog? Or does it take a whole slew of genetic changes? Do all domestic animals harbor similar genetic alterations, linked to things like tameness and color? Or does every species become domesticated in its own way? Were most animals domesticated only once? Or does it happen over and over again, in different places, once the idea catches on? Lots of questions about domestication, and the answers are still trickling in.

Recently, a flurry of studies on dog domestication has come out. Over the years, dogs, especially, have gotten a lot of attention from scientists interested in domestication, probably because scientists love dogs as much as the rest of us do. Although lots of studies have been done, the findings have been a little confusing. And when you consider how tough domestication is to investigate, the ambiguous results start to make sense. Basically, most studies work like this: you get a bunch of dogs and sequence some genes (or genomes, if you are lucky). If dogs from a certain region exhibit a lot of genetic variation, you start to think that maybe this is where they were domesticated. (More variation in one geographic location usually means an animal has a longer history in that spot.) But as it happens, when you do this kind of study, it really matters how you pick and choose your dogs. The more animals that you pick from one location, the more likely you are to find a lot of variation in that region. So if you study a lot of dogs from Asia, you find a lot of variation in Asia, and it seems like dogs must have been domesticated there. If you study a lot of African dogs, then THAT seems like an equally likely site for domestication. On top of that, genetic signatures start to get really murky because domestic dogs sometimes interbreed with wolves. When wolf genes enter a dog population, they introduce new genetic variants–and the greater amount of variation that results can make a dog population look “older” than it really is. I think the latest consensus is that it’s just not clear where dogs were domesticated. It may have happened first in the Middle East or Europe. Or maybe China. There is also a lot of confusion about when dogs were domesticated. Based on studies of nucleotide substitution rates, scientists have hazarded guesses of anywhere from 100,000 YBP (probably way too early) to 15,000 YBP. All of the estimates so far seem to predate agriculture, which began around 10,000 YBP, and both genetic and paleozoological evidence suggests that the dog was probably the first animal domesticated.
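To make the sampling bias concrete, here’s a toy simulation (with entirely invented numbers) showing that the region you sample more deeply will appear to harbor more variants even when the underlying diversity is identical in both places.

```python
# A toy simulation (all numbers invented) of the sampling-bias problem:
# two dog populations with identical underlying diversity, but the one
# sampled more heavily appears to carry more distinct variants.
import numpy as np

rng = np.random.default_rng(0)
N_SITES = 5000                            # candidate variable sites
FREQS = rng.uniform(0.01, 0.2, N_SITES)   # same rare-variant frequencies in both regions

def observed_variants(n_dogs):
    """Count sites at which at least one sampled dog carries the variant."""
    # Each dog carries each variant independently with its population frequency.
    carriers = rng.random((n_dogs, N_SITES)) < FREQS
    return int(carriers.any(axis=0).sum())

print("10 dogs sampled from region A: ", observed_variants(10))
print("100 dogs sampled from region B:", observed_variants(100))
# Region B looks far more 'diverse' purely because it was sampled more deeply.
```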

OK, so maybe we haven’t had a ton of luck figuring out where or when dogs were domesticated. But that doesn’t mean we can’t learn more about HOW they were domesticated–genetically, that is. Recently, Erik Axelsson, a scientist at Uppsala University in Sweden, and colleagues sequenced the entire genomes of 12 wolves and 60 dogs of various breeds. By comparing the two groups of genomes, they identified 36 genomic regions that appeared to have undergone natural selection in dogs. In other words, under the selective pressure of domestication, these parts of the dog genome have come to look different from the corresponding parts of the wolf genome. Genes involved in nervous system development seemed to be disproportionately represented among these altered regions of the genome. Since dramatic behavioral changes are some of the first things we think about when comparing wolves and dogs, finding these changes wasn’t terribly surprising. In fact, another study on dog domestication that just came out in Molecular Biology and Evolution concluded that genes expressed in the prefrontal cortex, a region of the brain responsible for complex cognitive behaviors like cooperating with humans during a hunt, evolved rapidly very early in the domestication process. Clearly, our best friend’s brain underwent major changes as we started to spend a lot of time together.
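For the curious, here is a rough sketch of the general logic behind this kind of selection scan. It is not Axelsson’s actual pipeline, and the window size, allele frequencies, and threshold are all placeholders: the idea is simply that genomic windows where dogs have lost far more heterozygosity than wolves are candidate targets of selection.

```python
# A rough sketch of a sweep scan (illustrative only, not the published method):
# flag windows where dogs retain much less heterozygosity than wolves.
import numpy as np

def window_heterozygosity(allele_freqs, window=100):
    """Mean expected heterozygosity, 2p(1-p), in consecutive windows of SNPs."""
    h = 2 * allele_freqs * (1 - allele_freqs)
    n_windows = len(h) // window
    return h[: n_windows * window].reshape(n_windows, window).mean(axis=1)

# Hypothetical per-SNP allele frequencies estimated from each group's genomes.
rng = np.random.default_rng(1)
wolf_freqs = rng.uniform(0.05, 0.95, 10_000)
dog_freqs = wolf_freqs.copy()
dog_freqs[3000:3100] = 0.01   # pretend one region was swept to near-fixation in dogs

h_wolf = window_heterozygosity(wolf_freqs)
h_dog = window_heterozygosity(dog_freqs)
candidates = np.where(h_dog < 0.25 * h_wolf)[0]   # arbitrary threshold for illustration
print("candidate selected windows:", candidates)  # the swept window stands out
```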

Axelsson’s study also found that genes involved in digestion and food metabolism, including starch digestion, were prominent on the list of dog genes affected by domestication. Since the ability of dogs to thrive in or near human settlements must have involved a big change in their diet (less meat, more starch), this makes sense too. What’s more, another recent whole-genome study on dogs and wolves, carried out by an independent group, identified the same trend: changes in genes involved in starch digestion appeared to be very important for domestication. After finding all these diet-related genetic changes, Axelsson and colleagues suggested that the development of agriculture may have catalyzed the domestication of dogs. This is interesting, since it would put the timing for dog domestication thousands of years later than other genetic studies have estimated. Since dogs being domesticated post-agriculture doesn’t seem compatible with a lot of the other findings, even recent ones, I guess it’s best to take a wait-and-see approach toward this interpretation.

One of the most exciting things about these recent findings is that they demonstrate that humans and dogs have undergone parallel changes over the course of our shared history. Similar genetic changes to the ones described in dogs have been found in human populations with high-starch diets, for example. And researchers who compiled a relatively comprehensive list of human and dog genes shaped by natural selection identified a substantial amount of overlap. Shared genes mostly fell into two categories: those involved in digestion and those involved in neurological processes. I guess it makes sense. Some people, like anthropologist Peter Wilson, have argued that since the advent of agriculture, we humans have been domesticating ourselves, settling down to live in large communities, eating new things, behaving in new ways. Dogs and people eat a lot of the same things, and we share the same environment. It turns out our genomes reflect our intertwined lives.

The 2013 take on the hobbits of Flores

When the first “hobbit” or Homo floresiensis skeleton was found in 2003 in a cave on the island of Flores, it made headlines around the world. But it didn’t take long for the arguing to begin. Did this small skeleton represent a whole new kind of hominin? A petite species that was still around as recently as 12,000 YBP, long after the Neandertals had disappeared? Or was it just the remains of some poor soul with a severe pathology? Scientists tossed around all sorts of ideas about which disorders could result in a person growing to only about a meter tall, with a tiny skull and a curious resemblance to Homo erectus. An Indonesian paleoanthropologist named Teuku Jacob was one of the first scientists to suggest that the skeleton could belong to someone suffering from microcephaly. Soon after the remains were discovered, he “borrowed” them, taking them from the center where they were kept and bringing them to his own laboratory. This caused an uproar. Eventually, he returned the hobbit remains to the researchers who found them, but they had been severely damaged. Among other things, the pelvis was smashed and several important bones were missing. As if that wasn’t bad enough, in 2005, Indonesia forbade researchers access to the cave where the hobbit was found. It wasn’t until Jacob’s death a couple of years later that research there was allowed to resume. And the colorful history of the hobbit finds doesn’t end there. Maciej Henneberg, Robert Eckhardt, and John Schofield self-published a book called The Hobbit Trap, in which they called into question the status of the hobbit as a new species. One of their objections was that the teeth showed signs of modern dental work, a claim Peter Brown (one of the hobbit’s discoverers) understandably called “complete lunacy.” Nevertheless, this claim enjoyed a lot of attention from the media. Some people have even speculated that species like H. floresiensis may still be hidden away in remote corners of the world–apparently, rumors of tiny people abound in Indonesia, and in particular, on Flores. I wish we lived in a world where finding another hominin tucked away somewhere seemed like a real possibility!

When the Flores remains were found, the hypothesis that they could have resulted from microcephaly or cretinism was reasonable. After all, when the first Neandertal remains were found, people thought maybe they belonged to a Cossack soldier with rickets. In the case of the hobbit, as in the case of that first Neandertal, there was just the one skeleton–and it’s hard to be sure about a new species designation from a single set of bones. But in-depth study of the skull recovered from the cave demonstrated that its features resembled those of archaic humans. And comparisons to skulls of people suffering from the proposed disorders showed that there wasn’t a good match. Eventually nine tiny sets of remains spanning 3,000 years were discovered in the cave, providing pretty strong evidence that the original find didn’t belong to an isolated individual suffering from a disease. And, early this year, a team of researchers showed that two different Homo floresiensis specimens had wrist bones distinct enough from ours to warrant a separate species designation.

What else do we know about the hobbits? Researchers think their short stature may have resulted from “island dwarfism,” a tendency of species to shrink over many generations once they have arrived on an island. A study published this April in Proceedings of the Royal Society by Yousuke Kaifu and colleagues suggests that this idea is reasonable and that the process could have resulted in the hobbit’s body plan. We don’t know much about what life was like for the hobbits, but it may not have been so different from what Homo sapiens were doing around the same time. Possible evidence of stone tools and cooking has been found in their cave. Some scientists believe that a volcanic eruption on Flores may have wiped out both the hobbits and the island’s Stegodon, a species of dwarf, elephant-like creatures that the hobbits liked to hunt. Although Svante Paabo has worked magic in the past, teasing the Neandertal and Denisova genomes out of ancient remains, no one has been able to coax usable DNA from the hobbit remains yet. I am hopeful that the future will reveal additional hobbit specimens, though, and that one of them may yield DNA suitable for sequencing. Maybe we will find that, like the Denisovans and the Neandertals, hobbit genes live on in us. Wouldn’t THAT be exciting?

Another reason to hate the flu: bipolar disorder

As if pregnant women don’t have enough to worry about! An interesting paper by a group I used to work with just came out in JAMA Psychiatry: Gestational Influenza and Bipolar Disorder in Adult Offspring. The authors report that children born to mothers who caught the flu during pregnancy have nearly four times the risk of developing bipolar disorder.

The idea that suffering from an infection during pregnancy predisposes your child to develop psychiatric disorders isn’t a new one. A group that includes a lot of researchers at Columbia University and Kaiser Permanente has been working in this area for a long time. In the past, most of their research has focused on schizophrenia. For decades, people had been bandying about theories on the prenatal environment and schizophrenia (I discuss some of them in this book chapter, in case you are interested). There is a well-known finding that birth month affects schizophrenia risk, for example. People born in winter/early spring are more likely to develop the disorder. Scientists and doctors had hypothesized that low vitamin levels or exposure to influenza during a particularly vulnerable period of gestation could explain this link–but without a good dataset, it was hard to get a good feel for the processes at work. Imagine the problem from a scientist’s perspective: somehow, you have to reliably link what happened during pregnancy with mental health outcomes decades later. In the United States, our healthcare system is so fragmented that it’s really tough to do this. Let’s say that I’m diagnosed with schizophrenia at my age: 32. What are the odds that a researcher is going to be able to locate my mom’s health records from pregnancy? What are the odds that they even still exist? And then imagine someone trying to do the same for 100 other people like me AND for many more individuals who don’t have schizophrenia, to serve as a comparison group. Not very likely, right? One solution came from a unique insurance/healthcare company: Kaiser Permanente.

The Prenatal Determinants of Schizophrenia study started with a group of about 12,000 babies born during the years 1959-1967 in the Bay Area, whose mothers were enrolled in the Kaiser Permanente Health Plan. Kaiser still has all the medical records from their mothers’ pregnancies, as well as their records from childhood and beyond. What’s even more incredible is that biological samples from many of the mothers, taken during pregnancy, are available to test for various things (like antibodies to influenza). The same goes for blood samples from the fathers and, in some cases, even cord blood samples. You can imagine that this sort of setup is a treasure trove for researchers. Decades after these people were born, investigators were able to identify the individuals who had developed schizophrenia. Then, by comparing the prenatal conditions encountered by people with schizophrenia to those experienced by individuals from the same group without the disorder, they could learn more about which factors appeared to increase risk.  This study has yielded all sorts of fascinating findings. When I was working with this group, I found that higher maternal levels of the n-3 fatty acid DHA, which is important in fetal and infant brain development, were linked to an increased risk of schizophrenia in offspring. It’s not clear why this would be–maybe higher levels of DHA indicate that a woman ate more seafood, which could contain mercury or toxins bad for a baby. The data on maternal infections collected in this project are much clearer, though: influenza, toxoplasmosis, respiratory infections, and genital/reproductive tract infections have all been linked to a higher risk of schizophrenia in offspring.

How general are these results? Do prenatal infections also increase the risk for other psychiatric disorders? Or are there different prenatal risk factors for each psychiatric disorder? Recently, Alan Brown (Columbia University) and colleagues have been developing a new study on the prenatal determinants of bipolar disorder, also using the Kaiser Permanente cohort, to get at these very questions. They have identified almost 100 individuals diagnosed with bipolar disorder (their “cases”) whom they can compare to “controls”–people who were also born into the Kaiser Permanente system but didn’t develop bipolar disorder. By combing through old medical records, these investigators were able to identify about 30 mothers who had been diagnosed with influenza during pregnancy. They could then determine if babies born to mothers who had the flu were more likely to develop bipolar disorder. And because they have a wealth of information about the families enrolled in the health plan, they were able to control for important factors such as socioeconomic status, how old the mother was, whether the mother had been diagnosed with bipolar disorder, etc. This study was particularly exciting because the finding was relatively strong. A lot of times in public health, some risk factor appears to increase the risk of developing a disease by 20% or so. When that happens, you worry that the finding could be spurious–maybe some factor is acting as a confounder and you were unable to completely remove its effect. Here, the results are pretty dramatic. These babies were FOUR times more likely to develop bipolar disorder, and the statistical difference in risk between them and people who were not exposed to the flu during gestation was very significant. I expect we will see this finding replicated in other studies in the future.
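For anyone curious about the arithmetic behind a case-control result like this, here’s a minimal sketch of how an odds ratio and its confidence interval fall out of a 2x2 table. The counts below are made up for illustration; they are not the numbers from the Kaiser Permanente study.

```python
# A minimal sketch of case-control arithmetic, using made-up counts
# (NOT the actual numbers from the bipolar/influenza study).
import math

# Rows: maternal flu yes/no; columns: bipolar cases vs. matched controls.
cases_exposed, cases_unexposed = 20, 80
controls_exposed, controls_unexposed = 15, 235

odds_ratio = (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)

# Approximate 95% confidence interval on the log scale (Woolf's method).
se_log_or = math.sqrt(1 / cases_exposed + 1 / cases_unexposed +
                      1 / controls_exposed + 1 / controls_unexposed)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.1f}, 95% CI ({lower:.1f}, {upper:.1f})")
# With these invented counts, the OR is ~4 and the interval excludes 1.
```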

As the research progresses, I’m curious to see how the prenatal risk factors for diseases such as schizophrenia and bipolar differ–if they differ. The same genetic profile can predispose people to multiple psychiatric disorders, like schizophrenia, bipolar, and depression. Exposure to famine during pregnancy has been linked to both schizophrenia and bipolar in offspring. At this point, prenatal influenza infection has also been linked to both schizophrenia and bipolar disorder. Are we likely to find any prenatal risk factors that are specific for a psychiatric disorder like bipolar?

One of the things I like about this vein of research is that if these findings are replicated, and we all agree that getting the flu during pregnancy puts your baby at risk for bipolar disorder, there is an easy solution. We know that getting the flu during pregnancy is bad for the mother and baby for a whole host of additional reasons, so it’s already recommended that pregnant women get the flu shot. For those who worry that the vaccine might be dangerous for the fetus or that the shot may not be effective enough to be worthwhile, take heart! This year a paper published in the New England Journal of Medicine showed that getting the vaccine reduced the risk of getting the flu by roughly 70% in pregnant women and that the vaccination posed no threat to the fetus (and may actually have prevented fetal deaths). Many risk factors for psychiatric disorders are not easily changed, but this doesn’t appear to be one of those cases. It’s one of those rare public health stories that could have a happy ending!

The Neandertal in Our Genes

Spit in a tube, stick it in the mail, and several weeks and $99 later, 23andme can tell you just how Neandertal you are. For the average client of European ancestry, an estimated 2.6% of the genome can be traced back to Neandertal ancestors. If you are one of the people carrying around this vague signature of a Neandertal past, you may wonder what it all means. Does a Neandertal ancestor account for your pronounced brow? Or your red hair? Until recently, nobody really had any answers. Things are starting to change, however. In a recent article in Molecular Biology and Evolution, scientist Fernando Mendez and colleagues revealed a specific gene that has been influenced by Neandertal forebears.

The variants are present in a gene cluster called OAS, which plays a role in immunity. Mendez noticed that a variant found in modern humans closely resembled the one present in the published Neandertal genome. And while Neandertals and humans parted evolutionary ways around 300,000 YBP, this genetic variant in the OAS cluster was found to have diverged from Neandertal sequences only 125,000 YBP. Finally, this variant is found in people from Eurasia and North Africa, but not sub-Saharan Africa, consistent with the area where Neandertals were once found. Pretty neat, huh?
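If you’re wondering how a date like 125,000 years gets attached to a stretch of DNA, here is a back-of-the-envelope sketch of the logic. The mutation rate, sequence length, and difference counts below are invented for illustration; the real analysis is considerably more sophisticated.

```python
# Back-of-the-envelope divergence dating (illustrative numbers only,
# not the data or method from Mendez et al.).

MU = 1.0e-9        # assumed mutation rate per site per year (hypothetical)
SEQ_LEN = 100_000  # length of the compared region in base pairs (hypothetical)

def divergence_years(n_differences, seq_len=SEQ_LEN, mu=MU):
    """Two diverging lineages accumulate differences at roughly 2*mu per site per year."""
    per_site_divergence = n_differences / seq_len
    return per_site_divergence / (2 * mu)

# A variant that split from the Neandertal sequence recently shows fewer
# differences than one tracking the ~300,000-year human/Neandertal split.
print(f"{divergence_years(25):,.0f} years")   # ~125,000 years
print(f"{divergence_years(60):,.0f} years")   # ~300,000 years
```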

And this is not the first gene that appears to have come from Neandertals! Last year, Mendez and colleagues identified Neandertal variants in another gene called STAT2. This gene, too, is involved in immunity. Although no one is sure what the functional importance of the Neandertal versions of the OAS and STAT2 genes is, one interesting finding is that alternate versions of both of these genes also appear to have entered our genomes from interbreeding with Denisovans, another type of archaic human that is known only from a single bone recovered from a cave in Siberia. This has led researchers to question whether some alleles, like those involved in immunity, may be especially likely to get passed along after a romantic episode with a mysterious stranger belonging to another species. It seems likely that in the future, additional genes that have been passed down from Neandertal ancestors will be identified.

There have been some fantastic articles about the research supporting human-Neandertal interbreeding recently. One was Sleeping With the Enemy, an article by Elizabeth Kolbert that appeared in the New Yorker in 2011–it focuses on Svante Paabo’s work. Another article appeared in Scientific American last month. It was by Michael Hammer, one of the scientists involved in the work I discussed here. It’s called Sex with Other Human Species Might Have Been Secret of Homo Sapiens’s Success, and it’s definitely worth a read.