Officials once recommended steering clear of allergens like peanuts and shellfish, but studies found that food avoidance didn’t prevent an increase in food allergies. So if early oral exposure doesn’t cause food allergies, then what does? Gideon Lack, a British allergy researcher, explored one theory. From An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases.
Lack surveyed British children with peanut allergies and their parents. He could discount one theory right away, he found. Sensitization wasn’t occurring prenatally. No peanut-specific antibodies showed up in blood extracted from these children’s umbilical cords. He did note, however, a strong association with environmental exposure to the allergens—not orally, but through the dermis.
Mothers didn’t know it, but some popular infant creams meant to soothe diaper rash, eczema, and dry skin contained peanut oil. Mothers who used these ointments had children with a nearly sevenfold increased risk of peanut allergy. What’s more, certain soy proteins, it turned out, resembled proteins in peanuts. Both belonged to the legume family. Some ointments contained soy products as well. Mothers using these creams could be cross-sensitizing their children to peanuts without, necessarily, sensitizing them to soy. Horribly, children with the most inflamed skin likely received the most ointment. Parents of the most allergy-prone kids were the most likely to inadvertently sensitize them.
The skin is the body’s largest organ. For terrestrial animals especially, it’s important for keeping moisture in. The epidermis contains waterproof fatty lipids and a hard scaly outer layer called the stratum corneum. The skin also serves as the first line of defense against a range of parasites, from the ticks, fleas, mosquitoes, lice, chiggers, and mites that bite but remain mostly outside the body, to the helminths, like hookworms and blood flukes, that burrow through it to seek our insides. Our immune system is probably inclined, therefore, to treat eukaryote proteins it first encounters in the epidermis as belonging to parasites of one sort or another, and to counter with an antiparasite response—the allergic response.
Eating proteins, on the other hand, usually leads to tolerance. That’s how oral immunotherapy—the process of deliberately training the immune system to tolerate peanuts, say—works. Unless otherwise interfered with, we are preprogrammed to treat proteins coming down our throats as food. Many pathogens and parasites also approach via the oral route, but the gut immune system—the most complex in the body—has ways of differentiating. An approach via the skin, however, is much less ambiguous. It signifies invasion.
So one interpretation of Lack’s finding went like this: children developed food allergies because they encountered food proteins through their skin first, which prompted an antiparasite response thereafter. The problem he’d uncovered was partially one of sequence. The route of first contact mattered. These children weren’t necessarily exposed to allergens too early; they were exposed via the wrong organ. If anything, they needed oral contact earlier, before they encountered the protein via their skin. Worst of all, keeping children away from allergenic food—preventing them from developing oral tolerance—might exacerbate the problem.
Indeed, after officials recommended steering clear of allergens, scientists found that food avoidance failed to curb the increase in food allergies. They continued to increase in both the U.K. and U.S. In 2008, the American Academy of Pediatrics backtracked, revising its guidelines. It now advised breast-feeding exclusively for four months, but after that, it no longer recommended delaying the introduction of any foods.
Parallel to Lack’s work, geneticists were also zeroing in on skin dysfunction in allergic disease, but from a slightly different angle.
THE MUTANT PROTEIN THAT SPARKS THE ALLERGIC MARCH
In 2006, a team of scientists in Irwin McLean’s laboratory at the University of Dundee, Scotland, identified two important variants of a gene involved in the skin’s structure and integrity. The gene they pinpointed encoded a protein called filaggrin, an important component of the outer layer of skin, the epidermis. The variants the scientists had identified hobbled production of this protein. People who carried two copies of these “null” variants produced almost no filaggrin. As a result, they developed ichthyosis vulgaris, a condition characterized by chronically irritated, flaky skin.
That fact was interesting in and of itself. But an observation by an Irish collaborator, Alan Irvine, elevated the significance of their discovery. In reviewing patient records, Irvine found that people with ichthyosis vulgaris also tended to have lots of eczema. Having insufficient filaggrin apparently predisposed people to allergic disease. More important, whereas it took two mutant copies to produce severe ichthyosis vulgaris, just one sufficed to increase the risk of eczema.
A flood of studies on “null” filaggrin mutations followed, all of them strengthening the association with allergic disease in general. People with the nonfunctional gene were less likely to grow out of their eczema, and more likely to have asthma as well. They were twice as likely to have allergies to dust mites and cats. In Denmark, the gene predicted the so-called atopic march. Carriers suffered more severe eczema early in life, as well as more asthma and allergic sensitization later.