A lot of people take their allergies for granted, but real treatments for allergies are available and they work. I am on allergy shots and my severe cat allergies are effectively cured, along with my lifelong hay fever. My son is doing sublingual immunotherapy for several food allergies including peanut, and he is now able to eat several whole peanuts. Before, he would react to milligrams; in a couple of years I expect him to be able to eat a PB&J. Luckily his eczema is gone now, but if it weren't, we'd be trying Dupixent, which is a wonder drug for a lot of people, and treating eczema in infancy is believed to stop the later development of other allergies and asthma as well.
It hasn't been understood until recently (and still isn't known by many practitioners) that these allergy treatments are dramatically more effective and safer when started in the first months or years of life. It makes sense that treating allergies would be easier while the immune system is developing rather than afterward. I suspect that in twenty years it will be standard to start infants on sublingual immunotherapy for all the common allergens (literally putting a tiny, strictly controlled amount of the allergens under the tongue once a day, simple as that) as a prophylactic starting around four months of age, and that will essentially eliminate all of the common allergies in the population. We need more research to get there, but that's the trend I see.
Another strong possibility is we'll finally discover the factors our immune systems are missing in modern life and reintroduce them, possibly in the form of probiotics, and stop most allergies from developing that way. Either way, I hope this is the last generation that suffers from high rates of allergies.
> A lot of people take their allergies for granted, but real treatments for allergies are available and they work.
Source please. I talked to several doctors about treating hay fever (grass pollen allergy) and the consensus was that desensitization doesn't work that often. Success rate is 20-50% or something like that and there's the risk that the body develops other allergies for some reason.
Apparently, my hay fever symptoms are rather strong. Medication doesn't really work. Best is avoidance: staying inside, vacation in countries with less grass pollen, and wearing FFP2 masks outside (helps a lot!). Stuff you don't breathe in doesn't need to be fought by the immune system.
I’m highly allergic to a variety of pollen, grass in particular. Like you, medication is really only a “help me recover faster from exposure” sort of thing. I take it hoping there’s at least placebo effect.
I first discovered my allergies after walking through a pollinating corn field for my agronomist job, and coming out blind and swollen on the other end. From then on I could walk outside and tell specifically when corn was starting to pollinate miles away, because it has a distinct smell that burned itself into my danger detection.
I’ve been doing the allergy shots for ~6 years now. I no longer get seasonal eczema, and I can now even ride an ATV by a corn field without protection. Previously even being within a quarter mile would result in a reaction.
The extreme fescue and ryegrass pollen here in Oregon's Willamette Valley, however, still requires me to use an N95 mask and dust goggles while outside for my job during June and July, or I have a bad time.
However, it's no longer to the point where I'd literally have to move elsewhere or die. My throat no longer gets tight, and it turns out my chronic and increasing eczema was due to pollen and is no longer an issue. I also recover from exposure much more quickly.
Basically the effectiveness of it all is personally mixed depending on the allergen, ranging from fuckall effect to nearly cured, but overall is absolutely worth it.
My recommendation is to have them load up the shot with as many allergens as you can get (20 or so last I checked) and continue with everything even after they say you can remove some of them.
Protip: if you have bad eye symptoms like me, where it feels like literal sand in your eyes, applying a very strong menthol/cooling rub on your cheek (not too close to the eyes; it should be below the cheekbone) will induce tears and save the day. I use a "kung fu balm", but Tiger Balm or Vicks VapoRub also works. Various goggles can also be used to occlude pollen as well as humidify your eyes.
Also, don't use nose sprays with oxymetazoline or eye drops marketed as "redness relievers", as those are a trap. They constrict blood vessels, but your body quickly adapts and you end up in a worse situation than before.
For me those help recovery, but don't help my direct exposure at all. Using a gel type eyedrop and letting that dry can kinda help because it will physically occlude the pollen to an extent.
That said, for people who aren't dumb enough to choose the worst possible matchup between their job and health conditions, those eyedrops should typically help. Alcaftadine (Lastacaft) is another one that recently became available without a prescription.
If you are willing to use it consistently, cromolyn is an old drug that may be effective pre-exposure. As I understand it, it’s essentially useless post-exposure. I’ve been told that it’s safe to stack it with antihistamines.
The ophthalmic form requires a prescription, for some reason.
Desensitization made a big difference for me. I had asthma linked to my allergic rhinitis and I was able to stop taking meds for asthma after (1) desensitization and (2) treating my heartburn which was complicating my asthma.
Turns out I can't take Omeprazole or any PPI for my heartburn, because if I take that I don't get a wink of sleep all night. If I avoid NSAIDs completely I rarely need any other heartburn med. I was seeing a doc at the immediate care center the other day for pain in my foot and he asked me "What do you take for pain?" and I told him "Nothing", because NSAIDs cause dyspepsia and my primary care doc thinks Paracetamol (e.g. Acetaminophen, Tylenol) raises my liver enzymes and told me not to take it.
I once in a while take Fexofenadine for my allergies which is the only antihistamine I can tolerate these days. Years ago my allergist told me to use Cetirizine but I discovered that Cetirizine causes CNS effects on me because if I take it regularly and then miss a dose I get terrible "brain zaps" like people describe missing a dose of SSRIs.
My understanding is that immunotherapy is not that well standardized and if you go to different docs there may be differences in protocol that get different results.
There are a couple other second generation antihistamines for you to take a look at. If you're having CNS effects, focus on the receptor binding affinity for anything besides H1 (also look at metabolites).
Effectiveness of allergy shots is generally related to the maximum dose you work up to. Allergy shots are expensive, and I had to get them for years to make my dust allergy manageable. Question you should be asking is how much of the low efficacy rate is due to early termination of treatment.
"risk that the body develops other allergies for some reason."
A more plausible mechanism is that people who get allergy shots also get more thorough allergy testing, and discover new allergies they already had. What mechanism would result in new allergy development from small, controlled subcutaneous allergen exposure, that would not manifest from continual daily seasonal exposure?
> What mechanism would result in new allergy development from small, controlled subcutaneous allergen exposure, that would not manifest from continual daily seasonal exposure?
Good question, I don't know. The doctor said this, I wondered as well, but I didn't ask.
I'm not aware of any increase in risk of developing other allergies from the treatment. It is true that success is not guaranteed, however I believe the success rate is higher than that. Here's a study finding 76% effectiveness in dust mite allergies in adults. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5705479/ And as I mentioned, I expect much higher effectiveness in children.
That said, the treatment does have risks and costs. It's certainly reasonable to decide against trying it. If your allergies cause asthma, Xolair and Dupixent are other options you can try. Combining Xolair or Dupixent with desensitization treatment is an exciting current area of research.
True, Dupixent seems less risky. It really seems like a wonder drug to me. I'm excited for the results of the trials of Dupixent with OIT, especially since it treats EoE.
I was told a similar thing by a primary care doctor in the late 90's. When I finally got fed up with my allergies again in 2014 or so and went to an allergist they were quite confident it was no problem at all to fix, and after several years of shots at various intervals I'm essentially cured of pretty severe grass/tree/pet allergies. At the end of it the allergist said there was a small chance that I might lose some of my immunity after about 5 years, and if that happened we'd simply plan to go back to once-a-month shots. It's been 7 years and I'm not showing any signs of increased symptoms, so according to him my immunity is most likely long-term.
When I first got tested they wiped off the grass scratch test after 2 minutes rather than waiting the normal 20, because I had already swelled up past the maximum amount. I was VERY allergic to grasses. Now I don't even think of them.
Easily the single most life-changing thing I've ever done. HIGHLY recommend visiting an allergist to see if you're a good candidate.
Most treatments, as far as I'm aware, take multiple years for maximum effect. The shots themselves are fairly painless if you have a good practitioner. My son's shots take about 20 minutes from entering the clinic to exiting: 5 minutes of waiting to receive the shot and 15 minutes to ensure there's no adverse reaction.
It's not a 20-50% success rate; this isn't a 2-week antibiotic course.
Frew (2006) is one of the larger trials, two treatments, one group worked up to dosage of 10k, the other 100k. Higher dosage group had better measures of improvement across the board (excluding lung symptoms oddly), table III.
Oral treatment is really low tech as long as you're not dying from it. For instance, for grass, the pollen is easily available, you can just chew seed stalks and spit them out. I've gone from being pretty miserable 5 years ago to not feeling much in the pollen season.
Yeah, it's possible to DIY in some cases since, as you say, the treatment is low tech, literally just having the allergens in your mouth. But I highly recommend using an allergist, especially if your allergies are severe, where the possibility of anaphylaxis is there. There are other possible rare side effects as well, such as eosinophilic esophagitis. Following a protocol where the amount is strictly controlled and possible side effects are monitored is important for safety.
Plus, if your doctor is very clued-up, they might even know of a scientific study to refer you to, where your experiences can contribute to the academic literature about your allergy. Sometimes you even get paid for participating in such studies.
The first problem with this would be if you have a severe reaction because you gave yourself too large a dose or happened to be particularly sensitive that day (having an illness or other immune reaction going on).
The other thing I'd question is the efficacy. The standard treatments gradually increase dose to "train" the immune system. Presumably you won't have as fine-grained an ability to adjust the dose you're exposing yourself to, and so you might over or under expose.
> ...treating eczema in infancy is believed to stop the later development of other allergies and asthma as well.
This is a fascinating topic, and one which I think deserves a lot more attention, as it crosses disciplines like dermatological, nutritional and respiratory medicine; it is related to many parts of the body and so could potentially explain not yet fully understood aspects of our biology.
In a limited fashion though, I think I know what caused my eczema, and what to remove to effectively cure it: stress! My eczema was nowhere near as bad as some people's, but for a while during childhood I saw the doctor about it; I used dermatological pastes daily for weeks and rubbed them off multiple times a day with cotton wool. However, eventually I gave up as it seemed to make no difference. Since then, it has become apparent to me that it is almost directly correlated with mental stress, which ironically (but perhaps luckily!) means I'm not really bothered by it any longer.
I understand that 'relax' probably isn't going to work on its own if you have eczema badly, but it's interesting that what seems like purely mental stress about entirely non-physical things has such a strong effect on our physiology.
> [...] discover the factors our immune systems are missing in modern life and reintroduce them
Sadly, many commensal bacteria that co-evolved with us have gone extinct due to a variety of modern interventions [1].
Re-engineering the gut microbiome is an incredibly challenging problem but essential to re-establish immune tolerance, which is governed by microbes [2].
"Short-chain fatty acids (SCFAs), which are generated by the bacterial fermentation of dietary fibers, promote expansion of regulatory T cells (Tregs)."
Dietary fiber is a variety of oligo- and polysaccharides, and certain species of bacteria excel at eating certain types, e.g. bifidobacteria like inulin. You can get your gut flora sequenced, speculate on what you're deficient in, look it up, and feed them their favorite food.
Sure, but I don't think things are that simple. It's a huge ecosystem, once it falls into the wrong attractor it is hard to bring it back into a different healthy state. For example, some key bacteria and their symbionts may disappear, or their predators may become really abundant. In this scenario, increasing dietary nutrients to favor certain bacteria might not work.
That is why stronger interventions, e.g. targeted bacteriophage therapies and fecal transplants are something that is getting trialed.
There's a wide variety of bacteria that inhabit the GI tract, with tremendous diversity around the world. The extinction idea is interesting, but I've not seen any evidence that demonstrates anything like it in the literature.
I don't think you are right. There is actually quite a lot of literature demonstrating how species present in hunter-gatherers are not present in the guts of individuals from industrialized societies. See e.g. https://doi.org/10.1016/j.cell.2023.05.046.
Still, even if this was not the case at population level, my point would still hold. At individual level, many species are bimodal. With the wrong nutrients, infections, etc. your microbiome might fall into a state where you have none of them, as they have been outcompeted, and reversing that is incredibly difficult.
Everything changes. That is the problem: we might not see it, but disease and even nature itself is in constant flux. It isn't because we are killing it off, which is the misunderstanding. If the landscape for bacteria and viruses changes, we should assume we are just as susceptible to things. We humans like to think of our lives as unchanging, immovable, irreplaceable and statically shaped. "The 90s lives on!" is a perfect example: we paint this illusion because it is hard to accept that from this point on, every second, minute and hour will change and alter us. That alteration is irreversible even if we can change or shape the outcome. People are just uncomfortable with this idea.
What you say is true, but 'the 90s' is not really a good example, because biology evolves measurably over millennia rather than decades. It might be hundreds to millions of years between a certain bacterium evolving and then becoming extinct, but in the Anthropocene we have accelerated the evolution and extinction of species by multiple orders of magnitude.
For us to have killed a species of bacteria, discover that it's an important species for our health and then be trying to recreate it, all in less than a century, is too fast even for us. I am seriously concerned that we'll end up killing ourselves before our scientific, social and cultural knowledge starts to compensate for what can only be described as recklessness with just about everything we touch.
You and I are important as persons and want to do loads of things in our lifetimes, but maybe it is worth leaving some things to future generations so as not to overwhelm our own.
> I am seriously concerned that we'll end up killing ourselves before our scientific, social and cultural knowledge starts to compensate for what can only be described as recklessness with just about everything we touch.
Beautifully phrased. I arrived at the same view.
I also find it a plausible variation on [1] as an explanation of the Fermi paradox—natural intelligence learns to undermine its own environment before it realizes it (due to complexity), destroying itself.
> I also find it a plausible variation on [1] as an explanation of the Fermi paradox—natural intelligence learns to undermine its own environment before it realizes it (due to complexity), destroying itself.
Every species, even unintelligent ones, destroys its habitat if it's not kept in check somehow. Herbivores will multiply and eat all of the vegetation if not kept in check by predators, predators will multiply if there's an abundance of prey and eventually be kept in check by famine. The trouble with intelligence is we're able to work around these natural barriers and expand the scope of habitat destruction, which we're seeing with climate change.
> I also find it a plausible variation on [1] as an explanation of the Fermi paradox—natural intelligence learns to undermine its own environment before it realizes it (due to complexity), destroying itself.
With that in mind, I feel as if it's almost a moral imperative for humanity to try its best not to succumb to destruction like that - out of principle, the determination to win a universal game that no species has ever won.
Perhaps it might be even more poetic, albeit cruel, if it really is our destiny to destroy ourselves. To be cognizant of our biggest weakness as intelligent life, yet unable to deviate from the fatal course set out for us by nature would be the ultimate tragedy. I think the classical genre might be more hard-hitting without malevolent gods giving us an excuse for our failures.
One step further is to ask how many people will have thought, as you and I have, these very thoughts before either humanity destroys itself or surmounts and escapes the fetters of its recklessness? Or will we just be perpetually lucky and get close to, but never quite reach destruction?
> For us to have killed a species of bacteria, discover that it's an important species for our health and then be trying to recreate it, all in less than a century, is too fast even for us. I am seriously concerned that we'll end up killing ourselves before our scientific, social and cultural knowledge starts to compensate for what can only be described as recklessness with just about everything we touch.
I am not sure this is all that great an example, considering that quite a few once-deadly, frequent diseases are neither deadly nor frequent anymore. Yes, we need to correct course, but the trade-off was not all that bad, actually.
Hmm, two percent of the atoms you were born with, or two percent of the atoms you consist of as an adult? (If former, it really is little, if latter, it's mostly because of the bulk added by normal growth.)
2% of what you originally had according to what some people say.
However, I can understand it, because even though it happens slowly, the body replaces cells and disposes of the old material through various means. Bone also gets replaced over time, which is weird to think about.
They are a hell of a commitment though, at least the shot-based ones. Once a week you have to go to a clinic, get the shot, wait 30 minutes to ensure no reaction. If you miss doses, you fall behind and have to repeat things. I got about 70% of the way through my treatment then had to travel for two months, after which the thought of starting back up again was daunting and I just gave up.
I brought the vaccine home and stored it in my refrigerator. The clinic's nurse showed me how to administer it myself. Not sure if that's possible everywhere / for every treatment.
After ten weeks taking shots every Sunday, my allergies got significantly better. I used to be that guy with a perpetually stuffy, running nose, now I can't even remember the last time that happened. 10/10 would do again.
Everything you're allergic to is something you were exposed to at least once; your immune system has to see it and learn to recognize it before you can react to it. Some types of exposure are good and some are bad.
Good types include oral exposure and continuous exposure (as in sublingual immunotherapy). Bad types include skin exposure (especially with eczema) and intermittent exposure with long periods of zero exposure (e.g. seasonal pollens including grass). Timing is also important especially relative to other immune triggers like actual sickness or vaccines or allergens that you already react to.
This is all poorly characterized in general and a lot more study is needed, but we definitely know of ways to expose you to allergens that are much less likely to cause allergies and in fact are likely to reduce or cure them.
“Everything you're allergic to is something you were exposed to at least once”
If you’re saying first exposure is what triggers/causes the allergy, I don’t believe this is true. Some people find out they’re allergic to penicillin the first time they get a dose. If they’re very unlucky it’s fatal.
That's a type II hypersensitivity (IgM or IgG antibodies starting the complement cascade) and it's different from type I allergies (IgE causing mast cell degranulation). Type IV (cell-mediated, lymphocyte reaction) hypersensitivity, like poison ivy, is also different.
And then there's a whole bunch of other hypersensitivities that are poorly understood, like atopy.
> Everything you're allergic to is something you were exposed to at least once; your immune system has to see it and learn to recognize it before you can react to it.
Is this true? From a biochemistry perspective, it doesn't seem like it's necessarily true. Allergies could be genetic, in which case a genetic mutation results in the phenotype where an immune system mis-identifies allergens as harmful.
In biology anything is possible. I'd never rule out something like that. But for all common allergens I'm aware of it is true, you cannot be allergic until after your first exposure. That said, some things are "cross reactive" due to similar proteins, meaning that e.g. you could be allergic to goat's milk even though you've only been exposed to cow's milk. And some of these relationships are less obvious, like cashew and mango or banana and latex.
IgE tests can detect allergies without exposing the patient to the allergen. But they are imperfect. The only true test is exposure.
The test counts as a first exposure. But the reaction can only occur on a second exposure. I've often wondered if allergy tests can cause an allergy and while the allergists I've talked to don't seem to think so, I honestly don't see why not. So I would not suggest getting any unnecessary allergy tests.
My doctor says that tests can cause allergies and is therefore hesitant to give any kind of "test for everything" prick test (the ones with exposure by small cuts in your arm or back). In layman's terms, as far as I've understood it: any allergic reaction, but also stress, infection, or environmental factors, will get your immune system into an "alerted, defensive" state where it will learn currently present substances as things it has to fight in the future. This can also include substances on your allergy test that are negative at the moment, but will be positive in the future because one currently true positive on the test aggravated your immune system.
So prick tests are a risk and you should never just blindly test, rather do the minimum necessary test for what you suspect will be positive, and only if an allergy is suspected anyways and serious enough to warrant the risk of testing.
If you will expose yourself to a substance, you will probably expose yourself to it hundreds of times. E.g. if you eat mussels at all, it might be something you eat now and then throughout your life. If each exposure has a certain probability of causing an allergy to the substance, then every individual exposure has a negligible risk.
Therefore, there is nothing extra risky about making the first exposure an artificial (but guaranteed to be safe) test, if you think you might be exposed in the future for any reason. In return for a negligible additional risk, you get an opportunity to discover if you are susceptible to an allergy before ever trying the substance. What am I missing here that leads to your advice of not "getting any unnecessary allergy tests"?
There are many cases that come to mind where someone has not been exposed to a potential allergen (growing up being sheltered from it, childhood aversion to it that was never reassessed in adulthood, simple lack of that allergen in previous environments) and I think it’s fair to want to avoid someone suddenly having new allergic reactions, especially when those reactions can “suck”.
An allergist I saw had four dogs walking around the clinic all the time. I asked him if that was a problem and he said in his experience, people aren't allergic to dogs, i.e. dander, but rather to the allergens that rest in the fur. Wiping the dogs frequently allows his patients to enjoy the company of the dogs. They were all short haired single-coated dogs.
The saliva, on the other hand, definitely can cause me to break out when a dog's teeth scrape my arm. But if I work with a cat and touch my eye, I get conjunctivitis right away.
Anyhow, the immunotherapy can really work, and definitely made my allergies better, even though it didn't eliminate them completely. For example, I used to be unable to share a room with a rabbit without an N95. Now I can visit rabbit households without a problem. My children will definitely be getting early allergy testing and immunotherapy if needed.
Never tried dogs. I did grass allergy shots for about 6 months as an adult. All that ever gave me was an itchy arm. My nose -still- becomes a spigot every time I have to cut grass...
6 months may not have been long enough for them to work. And allergy shot protocols are not standardized. Your allergist may not have used a large enough dose or the right updosing schedule. I've had shots at three different allergists (I stopped the first two treatments for personal reasons unrelated to the treatment), and this third one seems to give larger doses (which unfortunately makes the shots more painful), and is also the most effective one by far.
It took about a year before there was any noticeable effect for me. It was only at the 3 year mark when I could tell specifically how much better certain things were.
It is definitely not a quick fix; you need to commit to the long haul.
Edit: Yeah, paying out of pocket definitely makes it harder to keep up with. Fortunately my insurance covers most of the cost of my shots.
There are broad-spectrum allergy shots that work by essentially turning off the part of your immune system that triggers allergies (IgE antibodies) while leaving the rest of your immune system (IgG, IgM, etc.) intact. I've been on one called Xolair for almost four years now and it's worked wonders for my copious environmental allergies... dust bunnies hardly make me sniffle now!
> I'm allergic to grasses and dogs, for example, despite growing up in a house with dogs and tons of grass outside.
I believe ramping up exposure dose is an important part of the protocol. You don't immediately jump to the highest dose for many drugs. Exposure to large amounts of allergens regularly just trains your immune system to continue responding as if it's being attacked. The point is to introduce doses that don't trigger a strong reaction so the system learns to ignore it, and then ramp up from there.
> Another strong possibility is we'll finally discover the factors our immune systems are missing in modern life and reintroduce them, possibly in the form of probiotics, and stop most allergies from developing that way.
People who grow up with cats or dogs are a lot less likely to have allergies. They just drag everything in and expose you, I assume. [1]
I was diagnosed with hayfever as a boy and had immunotherapy for it. It was useless. I also had frequent exposure before my diagnosis (without symptoms), which seems typical according to my doctor, because allergies tend to develop /after exposure/.
This, as my anecdotal evidence, and every medical text about it, contradicts your ideas. Please consult a doctor before doing anything you suggested to your children or yourself, because your suggestions are /dangerous/.
Dangerous as in: exposure can cause allergies that haven't been there before. Medical treatment with allergens can also cause allergies and therefore shouldn't been done willy-nilly in a prophylactic manner.
Also, suggestions like those usually bring out the quackery in people, with things like "Oh, hayfever? Let's cure it by doing summer holidays on the farm!" and "Oh, peanut allergy? I have a sandwich for you, kid!".
Clearly you have never seen proper immunotherapy (which has been in practice for decades). All patients are prescribed an epi-pen prior to starting treatment. It used to be that all doses were monitored for anaphylactic reactions but that is now clinic-dependent.
I'm not suggesting that anyone should do anything without the advice of a doctor. Yes, deliberately exposing yourself to things you are allergic to is dangerous and an allergist's guidance is very important for safety. I'm sorry immunotherapy didn't work for you but your anecdotal evidence doesn't invalidate all the published research.
Yes, allergies always develop after exposure. But not all exposure is created equal! Some types of exposure tend to cause allergies while other types tend to protect against allergies. Immunotherapy is the right type of exposure, under the guidance of an allergist.
Well, you suggested prophylactic exposure in early childhood (which is usually before any allergy is diagnosed and usually also before it is manifest):
> I suspect that in twenty years it will be standard to start infants on sublingual immunotherapy for all the common allergens (literally putting a tiny, strictly controlled amount of the allergens under the tongue once a day, simple as that) as a prophylactic
Also, even therapeutic or diagnostic exposure can cause allergies; there is no safe exposure, just safer and less safe.
I'm not suggesting that people do that today. I'm suggesting that it's likely where we're headed. But as I said, we need more research to get there. Prophylactic exposure in early childhood is already proven effective and strongly recommended for peanut.
Sublingual IT didn't work for us much.
We are now on OIT and it's night and day. Real milk from day 1, up to 8ml/day now.
Agree with the overall sentiment.
But bear in mind the US is more advanced than many countries, these treatments may not be available.
Even in the US, OIT is done in research setting and not covered by insurance.
> (literally putting a tiny, strictly controlled amount of the allergens under the tongue once a day, simple as that) as a prophylactic
The English name for this practice is "hormesis".
A form of hormesis famous in antiquity was Mithridatism, the practice whereby Mithridates VI of Pontus supposedly made himself immune to a variety of toxins by regular exposure to small doses[1] (Kings had to be paranoid about poisons)
It’s astounding that we have an understanding of far-away phenomena like black holes and supernovas, but still have such a poor understanding of so many aspects of the systems in the human body.
Modern medicine is really primitive compared to the sophistication of biology. I think it will take superhuman AI to fully understand and control biological phenomena.
Different allergies behave differently and some are easier to treat than others. My allergy shots are a 3 year course of monthly shots. Afterward I will supposedly have lifelong protection but we will see. For food allergies personally I suspect that after 5+ years of exposure without reactions you're probably protected for life, especially if you start treatment as an infant or toddler, but there aren't any long term studies to determine whether that's true. Current guidelines for food allergies are to maintain daily allergen exposure indefinitely to maintain tolerance.
Just mentioned this in another comment, but if you have a wide range of environmental allergies then an anti-IgE treatment like Xolair would knock them all out at once, although you have to take it continuously to keep receiving its benefit. OTOH if your rhinitis is related to nasal polyps then there are other treatments you can look into to (permanently?) shrink them.
That's a leading hypothesis. Research has specifically shown that children who spend time in barns with hay and farm animals are significantly protected from allergies. Even before birth!
It does take a large amount of exposure, though. Something like a couple hours once a month is probably not protective and might even cause sensitization.
> A lot of people take their allergies for granted
This bit gets me perplexed.
Severe allergies usually mean pills and an Anapen prescription from a specialist that needs to be renewed at least every few years due to the expiration of the pen. If between two visits there were new ways to deal with a specific allergy, I'd expect the doctor to guide their patients (at least ask if they're willing to give it a shot).
Are there people out there followed by a specialist for years, subject to a deadly condition that affects their daily food choices, who never got asked if they'd want to get rid of their allergy once a cure was available? That feels unethical, borderline criminal to me.
The risk of death from food allergies is often exaggerated. Fatal anaphylaxis is actually extremely rare. Even in people with severe allergies.
Meanwhile the treatments are not 100% guaranteed to work, and they do have significant risks of their own, plus costs both monetary and non. So it's completely reasonable even for people with severe allergies to choose to not be treated. But I agree that more people should be made aware of the options.
> Fatal anaphylaxis is actually extremely rare. Even in people with severe allergies.
If your only metric is dead people, perhaps. If we look at people ending up in the ER in a severe state, there would be a lot more (anecdotally, that's how many of us have discovered allergies, from asking other parents dealing with it)
> Meanwhile the treatments are not 100% guaranteed to work, and they do have significant risks of their own, plus costs both monetary and non.
Thanks. I assume there's significant research behind them, so I can see how a doctor might want to wait for a bit more advancement before mentioning them to their patients.
Some common allergies, like hay fever or animal allergies, are simply put up with by most people if they are not dangerous. They make your life difficult, but it's not like you're in pain.
I was miserable every spring from the hay fever. Ironically, just when you're supposed to crawl out and enjoy nature after winter, I preferred to stay confined. Because of the dust mites, I had irregular sleep: I would have a clogged nose in bed or severe sneezing.
I'm glad we can depend on the modern medical-industrial complex to provide us with lifelong treatments that rescue us from the problems caused by....modernity.
Going forward, let's take the via negativa. Rather than add more dependencies, which come with complexity and misaligned incentives, let's see what we can remove.
Modernity has unambiguously created a net increase in health. The key here is net increase, which can happen even if some of what modernity brought was negative.
Over time we gradually figure out what the negatives are and either remove or correct them.
These life expectancy stats are deceptive because they factor in infant mortality. In the days of yore, if one survived to the age of five, one's life expectancy wasn't much lower than it is today [1].
Every time I talk to someone who's on a stack of meds or has gone through multiple journeys using modern medicine, they always seem to be the most unhealthy people. It seems like a chicken-or-egg problem, and maybe it's just confirmation bias, but I much prefer natural remedies. These people seem to be constantly fighting side effects. Too many unknowns when it comes to highly concentrated exotic chemical solutions.
Concluding medicine = bad because it's always the sickest people who are on the most meds is basically a caricature of bad logic. I refuse to believe this isn't satire.
Natural remedies like... taking small doses of the thing you're allergic to in order to desensitize yourself? Because that's what the immunotherapy I'm talking about is. Not "exotic chemical solutions".
It's always the people who decry medicines, i.e. highly pure, intensely tested dosages of chemicals, as "exotic chemicals" while in the same breath advancing "natural medicines", i.e. wildly variable dosages of hundreds of random molecules. Some people seem to think "chemical" is something artificial and bad while "natural" is somehow a rainbow of goodness and not the result of insanely complicated biochemistry.
When my wife and I had children, our respective parents, two very different families (one progressive professional urbanites, the other conservative rural farmers) were both unusually insistent that our babies "get dirty." I can still picture them sitting on the ground in the garden in their diapers, grabbing handfuls of dirt and sticking them into their mouths (they learned quick, though). My mother, a paragon of modern conveniences, was insistent we hand wash our dishes. And you'd better damned well breast feed unless you've got a really good reason not to.
I was raised to be independent and think for myself, and my internal voice was saying, "What's with all this hippie B.S.?"
When asked about it, my parents told us that these ideas were handed down from their parents (coincidentally, farmers and business-people) after generations of observing that kids who grew up "too clean" got sick more often. Literally, old wives' tales. But as an adult, exposed to other families through school and playmates, I saw this for myself. Probably confirmation bias, except now it's got science behind it.
Funny how some old knowledge becomes new knowledge again.
Wanted to write about this somehow. Having grown up in a third-world Eastern European country in the 80s and 90s, playing outside with so many other kids, being around all kinds of animals and pets, not caring at all about a sterile environment (eating, playing), allergies were such a distant thing that you would only see them in American movies. I never met or knew anyone who had an allergy until I actually started traveling (I know some may say they were undiagnosed). And later on, moving to another country and seeing how many people at work have allergies is such a weird thing.
I battled chronic pain for the better part of a decade. Every morning it felt like a weight was pressing down on my joints, muscles always tensed up. Nights were the worst, with sleep interrupted by sudden jolts of pain. The days became a blur of discomfort, overshadowing moments that should've been happy. I was so damn depressed back then.
Thankfully my sister-in-law happens to be an immunologist and had a hunch it might be tied to chronic immune reactions. It turned out that gluten was the problem.
She got me onto T4 phages, which, for those unfamiliar, are bacteriophages: viruses that infect and kill bacteria.
Anyway, within weeks of taking them, I noticed a stark reduction in my pain. It felt like a fog had been lifted.
It's not a universal solution, and I don't fully understand the mechanism, but it's made a monumental difference in my life. I still get flare-ups, but at least now they are tolerable.
As an aside, my dog also seems to be allergic to chicken, which my vet says is very common now, what with all the highly processed, mass-produced dog foods.
Likely a chronic overgrowth of pathogens (eg E. Coli), enabled by the disturbance in the microbiome via antibiotics, that got reduced by the phages. Pathogens (or imbalances) in the internal biomes can start autoimmune processes: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6300652/
There has been absolutely no legitimate, scientifically rigorous study that eliminates confounding factors and proves that gluten alone is the source of all these people's problems.
You'd think this community alone would be able to accept that until such a study is done you simply cannot draw such a dramatic conclusion.
Wrecking your microbiome with antibiotic overuse, and alcohol, is where my money is at. Every single person I know who swears gluten destroys their gut, also has a past of abusing antibiotics and or alcohol and utterly destroying their natural microbiome.
Celiac or allergy? I've got celiac, diagnosed at age 47, but it explains a lot of growth issues back to my teenage years. Never heard of phages in relation to gluten issues; that's very interesting. Seems unlikely to help with celiac since that's an autoimmune issue.
Not OP, but I have degrees in molecular bio and bioinformatics. It's not hard to imagine a mechanism where OP acquired some gut bacteria which consume gluten in the intestine and produce some inflammatory byproduct.
Depending on if this was specifically tested for, the phages could target that specific bacterium, or just broadly many bacteria.
That's a profoundly anti science stance. It's easily falsifiable, because organic wheat does not come in contact with glyphosate. And glyphosate is used widely outside of wheat production.
If gluten allergies were actually glyphosate allergies, it would be trivial to demonstrate.
Gluten allergies exist. Gluten is a protein, and people develop allergies to all sorts of proteins.
Hypothesis: there's a tick that causes one to be allergic to red meat because the body confuses the 'bad substance' from the tick with red meat. Could it be a similar situation with glyphosate and wheat?
> organic wheat does not come in contact with glyphosate
Organic crops do have pesticides in them, just in lower quantities. You cannot reverse half a century of pesticide abuse simply by rotating to organic farming in a few isolated places. It's everywhere, including the soil. Even if the farmer doesn't spray glyphosate, it still exists in the environment.
Wheat is also notorious for the amount of pesticide that can be found in it. Glyphosate is just 1 part of the equation, but that is another discussion.
Vegetables do have it, and other pesticides too. But depending on the crop, the application is different, and often stopped (or nearly stopped) by the skin. Leafy vegetables have no such protection, hence why they are almost universally "dirty." Depending on the type and amount, pesticides can and do penetrate the skin, like that of an apple.
Wheat is a little more special, because it is the most important crop of humankind, full stop. It is also protected by a shell, which gives a false sense of safety. Nuts too contain impressive amounts of pesticides because of the ubiquitous application, and the same perceived sense of safety. Nuts, wheat, and other grains all hold on to pesticides very well.
If you were asking about gluten allergy specifically, I don't think that's about pesticides at all. At least I just don't see a convincing evidence. Though that doesn't mean pesticides can't interfere with the mechanism. Like practically all things with our food, we simply don't know the answer.
> even meat?
Pesticides are present in meat too, though in lesser quantities, IIRC. We have laws against antibiotic use in farming for a reason; they too contaminate meat very easily. Though, at least with antibiotics, if you cook the meat for 45 minutes, their bactericidal effect is almost completely broken down. I say almost, because that, too, we don't know for sure. The best research that I know of was conducted in Iran, so take that with a grain of salt.
Well maybe it’s not only wheat. Alleged gluten allergies are mostly self reported. And it usually takes people years to figure it out. Very plausible that other things cause the same problems sometimes.
This is a bit misleading, though my original post was a bit misleading too, now that I read it again (I was speaking more generally, rather than just about wheat, but may have misspoken regardless).
Glyphosate famously breaks down after two weeks, but that's under perfect conditions, and the advice is to increase water to disperse the chemical. The textbook half-life is between 3 days and 19 weeks, though under what conditions is a bit hazy. Still, I would be inclined to believe that number when it comes to wheat.
Glyphosate on other crops, particularly trees for example, can take much, much longer to disappear, if at all. Trees that were sprayed 12 years prior can still have glyphosate in their tissues (Canadian study). Root plants are particularly good at retaining and accumulating it.
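For intuition, those half-life figures translate into residual fractions via ordinary first-order decay. A minimal sketch, using the 3-day and 19-week endpoints quoted above (the arithmetic is illustrative, not a field measurement):

```python
def residual_fraction(days_elapsed: float, half_life_days: float) -> float:
    """Fraction of a substance remaining under first-order (exponential) decay."""
    return 0.5 ** (days_elapsed / half_life_days)

# After the "famous" two weeks, the two textbook endpoints diverge wildly:
fast = residual_fraction(14, 3)        # 3-day half-life: roughly 4% left
slow = residual_fraction(14, 19 * 7)   # 19-week half-life: roughly 93% left
print(f"fast conditions: {fast:.3f}, slow conditions: {slow:.3f}")
```

The spread between ~4% and ~93% remaining after the same two weeks is why "breaks down after two weeks" and "persists for years" can both be consistent with the textbook range.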
Isn't it fairly probable that nobody remembers folks with as many deadly or serious reactions simply because kids just died of them?
Like the WWI "helmets cause more head injuries" kind of thing, where it's true head injuries go up, but that's because without the helmet they would all have simply been fatal shots.
What are the odds we're just keeping more kids with more problems alive better than previous generations did, and so the rates of these problems are going up as the kids aren't dying off before they hit five?
Allergy rates are significantly higher in urban areas compared to rural areas.
That wouldn't be compatible with your suggested explanation, unless you think rural kids are currently dying of allergies at a disproportionately higher rate -- which I don't think there's any evidence for.
We moved from downtown in a city to an hour from the nearest store, in the middle of a 20,000-acre expanse of national forest. My lifelong allergies disappeared inside of a year.
I moved from Los Angeles to downtown Dallas and my lifelong early summer hay fever disappeared. Sometimes, the specific plants that bother you just don't grow everywhere.
Why do you think that is related to antibiotic use in infancy? Are urban kids more likely to receive medical treatment as infants? Seemingly, bad air and water quality and exposure to industrial contaminants explains some of why urban people would have poorer autoimmune health than rural. Asthma especially has very straightforwardly tracked automobile emissions.
We do know that kids used to die a lot more, so that is quite possible. But it is even more likely that a combination of factors is in play: allergy rates are influenced by environment, and kids with allergies actually get diagnosed now.
This also makes me reluctant to get too mad at antibiotics. If you're taking them needlessly, fine, but given I caught pneumonia and missed four months of school in kindergarten, I think I'd have probably died if I'd been born before their invention. I guess this was luckily well after the age at which it seems likely to have any meaningful lifetime impact as postulated in this article.
And for what it's worth, my wife did have an infection as an infant. Whether this is causally related is seemingly impossible to know, but she did end up with all kinds of allergies. Eczema, gluten, allergies to most antibiotics. But she is also hard of hearing because of that infection. Without treatment, she'd be at best completely deaf, and at worst it spreads to her brain and kills her in her crib. While a lifetime with allergies isn't great, it's better than no lifetime at all.
> the most dramatic and visible increase has been the rise in global incidence rates for food allergies, which began in earnest in the 1990s and has grown steadily ever since
Modernity has been going on for a lot longer than the 1990s, so if the data indeed show a well-defined break at that time, then at least for some group of allergies there might be a simple cause-and-effect relation.
But the broader pattern of modern civilization modifying the environment at scale on the assumption that nothing can wrong in the long run if it is not obviously toxic in the short run is indeed the root of many of our growing problems.
It varies from one region of the planet to another. Iraq is an example: all food was organic in the 90s because sanctions forbade the import of most chemicals. You could hardly hear of a single case of allergies in the generation of that period. Now most food is imported as the two great rivers dry up, and all the younger generations have various forms of allergies, though nut allergy is still virtually absent there.
Food imports are generally well documented and medical records too. I am sure if people would spend the time one could create a good and detailed dataset to trace the culprits.
Don't expect this to be financed by the food industry though...
> Don't expect this to be financed by the food industry though...
which, honestly is the exact industry that should foot a decent amount of the bill.
along with many of the pressing issues a number of commentators have already brought up, we need to do something about industries who become gigantic simply because they externalize all the negative costs and pretend they had nothing to do with it or just muddy the waters enough that they’re never held accountable.
do we know if food industries are the primary culprits? no, but i’m also not sure it matters. if we allow industries to have significant benefits over everyone else, from writing laws themselves to never being held accountable to externalizing costs onto society, maybe industries should bear the costs of finding out what’s happening and the costs of fixing it.
maybe not tho, but it’s becoming more and more clear how unsustainable it is to let these behemoths externalize everything onto us, walk away with everything, and all but say “not my problem.”
> She blames the antibiotics for altering her children’s gut microbiome and herself for agreeing to the treatment in the first place.
My son had to be administered antibiotics within an hour of being born. He later developed allergies to a wide range of nuts. I don't know if they are related but I always wonder about the connection.
Hope you read the article to the end; I’m sure most people would rather have an allergic kid than no kid at all, which is what antibiotics used correctly enable. Of course one should avoid antibiotics for useless reasons and viral infections but I hope that was obvious even before the allergy discussion.
A sharp increase in nut allergies was seen when pregnant women were advised to avoid nuts, just in case. It actually caused more allergies because the fetus wasn't exposed to small amounts and so never learned to not overreact.
> There are, unsurprisingly, multiple theories about the cause. The hygiene hypothesis is one front-runner, positing that people who are “too clean” develop allergies.
This also breaks down into two versions with very different implications:
1. Not enough exposure to potential threats to toughen up an immune system.
2. Not enough exposure to benign "old friends" we co-evolved with that are almost symbiotic.
Don't know about research, but I have two pieces of anecdata:
The only kid I know of who was raised in clean-room conditions became asthmatic around 8-10.
I also know of another kid who got rid of his semi-severe peanut allergy by simply eating a peanut crumb every day, then an eighth of a peanut, then a quarter, then... you know the drill.
When I was a kid they diagnosed me and the doctor told me to take one pill every day before sleep.
I didn't like the side effects of the medication but I just put up with it until I was an adult. One year I started to take one pill every two days during the season and my symptoms were bearable, more bearable than the side effects of the medication. A couple years later I started taking one every three days.
Nowadays I take maybe two or three pills a season (4-6 weeks) and only if I feel that an uncontrollable sneeze attack is coming.
And yes I tried different brands of this medication.
The evidence I've seen suggests 2 over 1. It seems like sustained (not intermittent) exposure to diverse things that don't make you sick is good for the immune system, while actually getting sick has no benefit and is actually detrimental, beyond limited immunity to the single strain of the single thing you got infected with (and even that isn't guaranteed).
There are some studies showing a correlation between antibiotic use by the mother during the third trimester of pregnancy and subsequent chronic allergies and development of asthma in the child. This has also led to advice to change the treatment of urinary tract infections.
> A 2019 study led by Nagler showed that the gut of healthy infants harbored a specific class of allergy-protective bacteria not found in infants with cow’s milk allergy.
Between milk being a significant growth factor for kids and now this data about allergy-proofing, I seriously worry about the public mudslinging against dairy for political reasons. A lot of people are probably misled to think dairy is harmful for their children when it is the exact opposite.
What does milk being a "growth factor for kids" mean? That statement needs to be backed up with some clarity and data.
The majority of the Earth's population is intolerant to lactose, a component in cow's milk, so it's patently false that milk is some fundamental and crucial part of the human diet.
> ...increases in milk consumption over time are associated with large reductions in child stunting even after controlling for important confounding factors. Countries with high rates of stunting should therefore consider nutrition-sensitive strategies to increase dairy consumption among young children...
Given how lost a significant part of the populace is in regard to nutrition, even in the Western world, it can surely be considered appropriate here too.
In the geographic area where I live, milk (cow, sheep, goat) was indeed one of the most important food items. And milk intolerance was, and still is, rare. Milk was also always seen as the thing to give to kids so they grow.
And milk was actually a practical option: you don't have to store it, you don't kill the animal for it, it's less work than growing plants, and it has tons of calories in a place where getting enough calories is not easy.
> The majority of the Earth's population is intolerant to lactose
I don't believe this is true of infants, people tend to develop lactose intolerance as they mature. Milk is an exceptionally good source of all kinds of vitamins, minerals, fats and even protein. Seems crazy, but studies have found that you absorb milk faster than water when dehydrated.
Your reasoning is erroneous. "Humans do not produce lactase, ergo it's false milk is fundamental/crucial". Humans do not produce the enzymes to digest some dietary fiber, ergo it's not fundamental/crucial. The last sentence is false.
Consuming an excessive amount of any type of oligo or polysaccharide will result in basically a laxative effect, GI discomfort etc.
You've got a few options: consume exogenous enzymes (Lactaid pills), avoidance, or colonize your GI tract with bacteria that do produce lactase and feed them a small amount every day (start at 1 gram of milk in your coffee, add 1 gram each day).
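That "start at 1 gram, add 1 gram each day" protocol is just a linear ramp, which can be sketched as a trivial schedule generator. The start and step values are the commenter's suggestion, not medical guidance; anything like this for an actual allergy belongs under an allergist's supervision:

```python
def escalation_schedule(start_g: float = 1.0, step_g: float = 1.0, days: int = 7):
    """Linear daily dose ramp: day 1 = start_g, then +step_g per day."""
    return [start_g + step_g * day for day in range(days)]

# First week of the commenter's milk ramp, in grams per day:
print(escalation_schedule())  # [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
```

A linear ramp like this grows slowly; published immunotherapy protocols often escalate multiplicatively instead, which the same function could model by changing the body to `start_g * factor ** day`.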
Grew up with copious amounts of Norwegian cow milk - and zero allergies. :) Back then we were free roaming kids as opposed to today, where I now am in Melbourne Aus and it's very very different. However, neither of my children have picked up any either.
The fear for peanut allergies was so big that we introduced peanut butter to them at GP's office with epipen at the ready (very early in their lives). Feels a bit crazy, but that did get them introduced safely and early.
Roaming outside and generally getting your hands dirty in all kinds of soil is also, btw, a potentially significant factor in relation to gut bacteria. That is something I still think the Scandis (and here in FIN) have a somewhat special emphasis on.
I grew up on a diary farm in the UK and spent large amounts of time outdoors and had access to a lot of milk. I was strongly allergic to milk (not just lactose). Was also allergic to other things.
the article talks about it - the best antidote to allergy is early introduction:
> Right around the late ‘80s and early ‘90s, when food allergy rates were starting to increase, the American Academy of Pediatrics said to withhold peanuts and allergenic foods from pregnant mothers, from nursing mothers and from children with risk of allergy until they’re four years old. That was exactly the wrong advice, and that fueled the fire and caused even more increase. Now all of the push is for early introduction.”
of course there are cases of severe allergy, but these seem to be quite rare. as with everything today, allergy is also way overdiagnosed by lazy & fearful doctors.
Yeah, our daughter was born with a cow's milk protein allergy. She got early introduction via breast milk and it just made her really sick; it was a tough first few months until we figured out the problem. My wife stopped eating dairy and our daughter's symptoms were gone in less than two weeks. I hope we can find a cure, but early introduction definitely wasn't it.
Same situation here. For us it went away with gradually introducing dairy to her and that was the protocol given to us by the doc here.
She hasn't developed an appetite for cows milk yet (prefers oat), but enjoys many other kinds of dairy now, like cheese, yoghurts etc. without any symptoms.
What I read from the article is that gut bacteria seem to prevent allergy, and that kids with an allergy to cow milk were missing the gut bacteria that would have prevented such allergy.
In this case that food is probably not cow's milk, which is typically pasteurized before being sold. This is not a debate about macronutrition; it's a debate about the particular distributions of gut bacteria in different people.
I don't think the commenter is saying that they are lacking the bacteria. If they have a bacteria that feeds on lactose/milk, and they give pasteurized milk, it will promote growth of that bacteria.
Only if they don't have the bacteria at all would pasteurization make a difference.
Anecdote: from my youth I've always been reactive to grass pollen, dust, and a few other things, with strong and repetitive sneezing. Desensitization injections did not work (circa 1980).
Recently, due to getting diabetes, I switched to a more low-carb/ketogenic diet, up to carnivore-like sometimes. It quickly reversed my diabetes (recent science seems to confirm that such a diet change can do this in some cases), but the diet also had the side effect of ending my allergic sneezing. I realized it while walking near grass fields with no sneezing.
I switched to only one carby meal (not really carby, just "typical carb levels") a day, keeping the rest of the day low-carb, and it has helped (by a very large margin) with the fugue state I would always be in by the end of the work day (say 3 or 4). All my blood work, vitamin levels, etc. were fine; I just changed my diet. My doctor suggested I try low carb/keto (he's a fanatic over it), but I knew there was no way I could handle never having bread again, so I chose the third option of greatly reducing the number of carbs. Seems to have helped, like with you. I am eating a lot more meat than I used to, though. I try to stick mostly with white/brown meat and not so much beef, but everyone loves a good hamburger occasionally.
There are various definitions of "processed" but I think they center on changing the food structure in a way that doesn't even happen with cooking, for example reforming fats or proteins. There's a large set of vegan products based on reforming soy, rice, and other proteins and I consider those to be very processed, and they have gained in popularity.
Food is getting more and more accessible to buy and order - are people cooking less at home?
People ask each other, "which protein powder do you use?". Protein powder is processed food.
What does canteen and school lunch food look like? The everyday food of children all around the world ought to be very important for this.
My personal theory about allergy is that allergies are caused by the co-occurrence of mild poisons that are not directly detectable by our immune systems (like air pollution) and harmless, easily recognizable benign things like pollen. Our bodies notice the poison damage but pinpoint the source wrongly and trigger responses that evolved in us to help us get rid of actual phytotoxins.
I've had eczema for 40 years. It's gotten much more manageable with age, but I'm still allergic to a few things, such as dogs.
I'd much rather be alive and dogless, than dead in the first year as was common before modern standards of hygiene.
Modern life has bugs, but modern technology can help produce bugfixes. Perhaps in the next few decades it will become common practice to expose babies to a range of relatively harmless pathogens to train their immune system, just like we vaccinate them now against more deadly diseases. Perhaps as we learn more about our own gut flora, we will develop a way to add or subtract specific strains of bacteria to bring about a desired result, instead of eating random "probiotics" to hack the immune system in the dark.
I'd be much more optimistic about such solutions gaining traction, both scientifically and commercially, than the prospects of telling everyone to go roll their kids in dirt. That's just the way modernity works. It doesn't like to stop and look back. It barges ahead, update after update, fixing bugs as it goes along.
And modernity sometimes ignores some obvious considerations.
I am surprised that a long article attempting to find links between various manifestations of modernity and allergy fails to mention vaccines.
Diet, exposure to pollutants, sun, level of cleanliness, etc. have certainly evolved over time. But none of these have changed as dramatically, become as prevalent, or directly alter our immune system the way vaccines do.
I don't know of any study that's been able to explain how vaccines may increase the risk of allergy. However, if we are to speculate about possible causes for allergy, it should be nearly at the top. And the omission of even a single mention is quite striking.
The fact that we cannot even talk about something so obvious should give some of us pause to wonder what is causing this kind of blind spot in modernity. And that should also lead us to ask who is trying to hide this, and for what reason.
> The fact that we cannot even talk about something so obvious should give some of us pause to wonder what is causing this kind of blind spot in modernity.
It is not at all obvious to me. For one, the timing of the introduction of vaccines does not match the timing of the allergies becoming widespread. Or are you proposing that a specific vaccine could be the link?
I've never heard of this before your comment. It certainly seems like a novel theory. Before trying to ask who is trying to hide this, why don't you propose or fund a study?
It definitely can't be the immune system altering drugs we give to children in modern society, the likes of which have exploded in usage since the early 1990s. Nope, certainly not even worth mentioning here.
But are you sure you aren't just trying to be contrarian and fishing for data?
Vaccination dates back to the late 1800s, with the first widespread programmes (for yellow fever) in the 1930s.
Most of the spike in allergy/asthma incidence begins in the 1970s.
Excessive use of disinfectants, and excessive use of antibiotics is a much more likely mechanism.
Antibiotic resistance was precipitated by excessive antibiotic prescription in both humans and the animals humans eat.
Coincidentally, the rise in antibiotic resistance from the 1970s onwards aligns closely with the rise in allergic anaphylactic reactions over the same period.
You can fix your immune system to stop overreacting to allergens with allergy immunotherapy. Especially for inhaled allergens (pollen, pet dander, dust), this treatment is proven.
It's available in the US through insurance via painful shots in allergists' offices, and at home through under-the-tongue tablets and drops via digital health [1] or university hospitals [2]. In fact, the under-the-tongue option is widely used in Europe, where insurance doesn't dictate patient care.
The gene pool is pretty much identical in Europe and in the US (at least for whites, which is a fairly large sample of the US population anyway), so the massive increase of obesity in the US must be due to factors specific to the US in terms of lifestyle or food products.
There are a LOT of countries all over the world with obesity problems; the US is just high up the list, but we are very much not alone. So let's stop blaming the US for everything, OK?
1) We live in a much more sterile world. Where our immune system used to get a workout, now it just sits on the (metaphorical) sofa all day, watching TV while drinking soda and eating junk food. Along the same lines, the cleanliness means less diversity in gut bacteria.
2) We live much more siloed. Hunter-gatherers had families that were together more often, hunting and/or gathering, eating and/or socializing around the fire. Less interaction means less exposure to bacteria and viruses (read: the immune system doesn't get stronger), as well as less exchange of what ultimately becomes gut bacteria.
I returned yesterday from a hospital stay. Before going in, I had been having ever-worsening asthma and related symptoms, which stopped when I ... stopped taking my asthma medication. Generic medications in the U.S., practically speaking, are allowed to have more toxic contaminants. Even more so for generics produced outside the U.S. Many articles about this are titled something like "our medications are poisoning us". I point this out because it's a factor of some significance not mentioned in the article, as far as I can see.
Therefore it can't be called modernity, as that makes it sound progressive; instead it's taken us back, made us allergic, more depressed and stressed. But at least we can have nice cars and houses, hey. Oh wait.
I grew up in rural N. Georgia with a lack of modern amenities and I long for it every day. I think people miss out on a dose of the woods now and again.
On human allergies, sinus rinsing works. Technical medical analysis: it clears the crap out of your sinuses. Once a day or at most twice.
(If you've never done this, you're probably remembering inhaling some water when you're swimming. It's not like that; the saline and bicarbonate make it not burn your tissues.)
Dogs: almost all the dogs I hear about have some kind of allergy. Mine was constantly licking his paws. There's a narrow-spectrum drug, Apoquel, that works for him.
where people had a very different discussion that was overall quite negative compared to here, leaning in the direction that the article was alt-health hokum. I'm quite fascinated with the differences between this community and that community, particularly in that "Ask HN" is usually a disappointment whereas linkless discussions thrive on Tildes.
Serious question because I haven’t had time to dig into it and I have gotten some great answers to biology questions here in the past.
There is a theory going around the internet that the adjuvant component in vaccines which enhances the learned immune response to the viral matter in the vaccine may sometimes (unintentionally) also enhance the learned immune response to other materials in the body around vaccination time, which results in an “allergy” to those other material(s).
Logically speaking this doesn’t sound crazy. Has it been studied and disproven? I’ve seen a few “debunkings” but they all seem to misunderstand the theory and dismiss it out of hand.
Apparently there are no known mechanisms by which vaccination in general might trigger autoimmune disorders, and statistically speaking, this is also not happening:
Thanks for the link. Unfortunately though I do not believe it is relevant to my question. Allergies are an immune system disorder, but they are not an auto immune system disorder (unless the human body has started producing pollen without my knowledge). The article does not mention allergies as far as I can tell (probably for this reason).
Allergies and auto-immune disorders are "expressions" of the same underlying process: dysfunction in regulatory T cells, the part of the adaptive immune system that "trains" it on what to "attack".
Sure, in the same way that busses and porcupines are expressions of the same underlying process. There are also some key differences, or else we would expect a similar prevalence of autoimmune disease as allergies, no?
Allergies and auto-immunity are both pathologies of the adaptive immune system "incorrectly" targeting things. Regulatory T cells ("Tregs") modulate this.
My point is that while there may be some broad commonality (with physics being a hyperbolic commonality in the example I gave), this does not mean that there are no differences and that we can always infer information about one from information about the other (though in some cases we can).
BACH2 is one of the genes being investigated for its role in allergies, auto-immune disorders, and Tregs, which is why I keep bringing them up in this thread.
I think what you are implying above is that we can take a study or survey about autoimmune disease and use it to make conclusions about allergies. All I am saying is that this does not logically follow, because the two are not the same and “allergy” is not a strict subclass of “autoimmune disease”. The specific details of the mechanism are not relevant to the logical point unless those details establish a subclass or identity relationship (which here they do not).
Ah sorry, from your quote I understood that you’re talking about actual autoimmune disorders:
> also enhance the learned immune response to other materials in the body around vaccination time, which results in an “allergy” to those other material(s).
I mean, those are literally “allergies” to materials in your body.
Right, IIUC the adjuvant being present alongside the e.g. inactivated virus causes the immune system to react more strongly because the adjuvant is a harmful substance (though in small enough quantity that it is harmless overall, of course).
The obvious question is: how does the adjuvant specifically target the inactivated virus while avoiding having the same effect on anything else in the body? AFAIK the adjuvant substance itself is not chemically complex in a way that would give it the ability to target anything at all.
Leeches. A four month therapy made my allergies better by magnitudes.
I had problems with my knees and after being through basically all kinds of normal therapy I tried unconventional remedies. Nothing like homeopathy but natural remedies. In this case three medical leeches applied to my knee area every month for four months in total.
Knee still hurt as bad as before. But I suddenly noticed that it was April already and I didn't need any Zyrtec. In that season I went from taking around 100 before to 0. The allergies are not completely gone but are merely a tiny nuisance now, not in need of pharmacological intervention.
And yes it was bad. I had anaphylactic reactions when I was around 7. And the worst one at one of the arguably best places to have one, Eurodisneyland. Doctor there came in two minutes tops.
When I was 10 or so I started desensitization immunotherapy via shots in the arm. I did that for around 3 years per allergen and it did help somewhat. But nothing compared to that leech experience.
Yes, this is an anecdote, but this was 10 years ago and I still get to enjoy my summers now.
Just n=1, but I grew up in the country playing in the woods, creeks, and dirt, handling bugs and reptiles, surrounded by tons of flora and fauna. I rarely get sick and have only one allergy that I know of. Meanwhile it seems like many of my friends that grew up in the city proper always complain about allergies, getting sick, covid wrecking them, etc. I never caught covid either (I was vaxxed and boosted a couple times). I think there may be some truth to the exposure theory, especially for the youth.
I got my allergies 13 years ago, as an adult, it's still very fresh in my memory.
Walked downtown of a Swedish city on a hot and humid summer day, I remember the air when I passed through the city park was so thick with pollen that it smelled musky.
A couple hours later I was walking home and my nose started running, yet another couple of hours later my entire face was gushing liquids so bad I could barely see where I was going when I headed to the pharmacy for antihistamine.
Before this I had never had an allergic reaction, to anything.
According to doctors my allergies are supposed to be timothy grass pollen, birch pollen, dog and cat fur. (I took care of a dog for 10 months before this with no symptoms.)
Now 13 years later I have made some observations that might help others with their allergies, as a layperson.
The first few years were the worst, but it gets better! I used to have to take pills for at least a month every summer, making me drowsy and essentially ruining summers for me.
13 years on, and this summer I've taken 0 pills and only used my nose spray 5 times, maybe.
I've had a Jack Russell dog for 6 years now, constantly by my side, in my home and even in my bed. It seems as if I've gotten used to my dog's dandruff because if I visit someone with a dog I break out into sneezing and runny nose within a few hours in their home.
I make sure to feed her food that will promote a healthy and moisturized skin, because that's where the dandruff comes from, the fur is just the medium.
Another important detail is that my allergies are worst indoors; outdoors I barely feel them except for the occasional multi-sneeze, twice a day maybe.
Another interesting detail is that my allergies are mostly present in Sweden! I recently moved to Croatia and the 5 times I've used my nosespray this year have all been while visiting Sweden.
My layperson theory is that there is too much grass in Sweden. Cities there focus a lot on planting and maintaining grass. It's nice, but I don't think it's naturally balanced. In Swedish nature, grass meadows are always uncut and always interspersed among forest areas. In cities, large grass meadows are regularly cut and interspersed with parking lots and buildings.
There is nowhere for the pollen to go, every summer there are swathes of pollen lining the streets after any rainfall.
Down here in Croatia there is a lot less of that. It's not non-existent, I remember seeing the largest examples of timothy grass ever here. Once we took a wrong turn and ended up driving through a field, lots of pollen stuck on the radiator of the car, when we got out of the field I broke out into the worst allergic reaction I ever had down here. So the timothy grass is present, but I believe it's managed better by nature, as long as I don't stick my face in a field. ;)
So this article doesn't cover city planning, it's more about chemicals, but I believe city planning plays a huge role.
Modern life seems so paradoxically familiar that it obscures the environment we evolved in 300,000 years ago.
So, for starters: 1,000 new chemicals [0] are synthesized each hour (!) with little to no environmental oversight.
Our microbiome (protists, bacteria, archaea, fungi, viruses) seems to be a good proxy for how far we have strayed from the richness of the biochemical "informational flow" present for 99.9...% of our time (evolutionary history).
In H.G. Wells' The War of the Worlds (1898) the Martians are killed off by earthly pathogens or simply incompatibility.
As noted in the article with H. pylori, "pathogens" are more often than not the result of a disturbed equilibrium; the modern world we indulge in is increasingly making us incompatible with earth's biochemical interface. By then trying to single out disease vectors, the overall context gets completely lost. By emphasizing the biological role, some perspective can be gained:
"Viruses" are powerful accelerators [1][2] for an otherwise slow mutation rate.
"Parasites" are relatively new connections which over time resolve themselves to a symbiotic relationships [3].
Even the much dreaded "parasitic worms" (helminths) have a great potential for immunotherapy [4].
In a sense this disconnect slowly began with agriculture/pastoralism: in the effort to maximize yield, the environment turned hostile to us, and so conceptually we introduced "pests" and declared war.
A successful model: we are now extracting resources on a globalized industrial scale. Our powerful tools have changed the landscape, yet we are basically operating under the same scarcity mindset as a farmer 10,000 years ago: we earn our bread to support our family and make our sacrifices (pay taxes/tributes) to shield us from the wrath (anarchy/forces of nature) of the gods.
And indeed our human bandwidth to the biosphere grows ever smaller, while we hope, in that imagined scramble inherited from our ancestors, for an unlimited bandwidth (e.g. to the spiritual world of eternal silicon) to free us from the tyranny and bondage of biological systems.