Does it seem to anyone else that the odds of outgrowing a food allergy are a bit of a crapshoot?
The odds seem even, well, odder lately. For example, there's this recent study that seems to promote the idea that kids who strictly avoid are more likely to see resolution of their allergy. What does that even mean, strictly avoid? My son ate next to no processed foods, so I would imagine we were in the "strict avoidance" group. When he was RAST tested as part of the clinical trial, his score was so low he barely qualified. Yet he still reacted to peanut - and now his RAST has climbed back up.
Are peanut challenges as part of these trials a bad idea? If that's the case, though, how does that square with oral immunotherapy and SLIT? What about the kids who are going through the challenges and tolerating peanut? How do doctors ever know when to flip that magical switch from avoid to introduce?
Here's another study of people who outgrew, this time for tree nuts. Some of the kids had experienced severe past reactions, yet they still passed.
Then there was that weird SLIT study earlier in the year, where two of the kids in the placebo group spontaneously saw their peanut allergy resolve. I remember laughing when I got to that section of the write-up. You could tell that the researchers were a little miffed at having to explain the anomaly (and the explanation was basically "hey, we don't know, some kids just outgrow").
Those of us who have kicked theories around for years have hypothesized that there are different types of allergies: some that will be outgrown pretty much no matter what parents or kids do, and those that won't.
But what if there's something altogether different going on here?
I've been fascinated for years by the research that showed kids born in the winter months have a higher incidence of food allergy. This study showed an almost 20% increase in food allergies among fall/winter babies. This one showed an increase of 53%! One hypothesis for this is that when very young babies are exposed to heavy pollen loads, their immune system is more likely to learn to overreact. It's also become pretty clear that Vitamin D is somehow playing a role in this.
Honestly, though, I don't care about any of that. I just want to know how to get my kid to pass food challenges. I'm sure you do as well. So...here's my theory:
What if passing or failing a food challenge depends on the time of year the food challenge is given?
Think about it. If kids are predisposed to develop food allergies based on their month of birth, then perhaps those same factors are still in play when it comes to the waxing and waning of food allergies.
Based on all this, the best time to introduce a new food to the immune system would presumably be January or February. It would be especially effective if the child's Vitamin D level was high at the time, either through a good summer/fall spent playing outside or supplementation.
Our own experience aligns with this. My son's final clinical trial challenges (the ones where he did well) occurred in October and January; earlier this year, he passed a soy challenge in February. Would he have passed if we had scheduled it in, say, May? Or would his already-overburdened immune system have gone crazy, re-sensitizing him to soy and undoing all our hard work of avoidance?
The uneven results of many of these clinical trials might also be at least partially explained by the periodicity of testing. It would be fascinating to see the results of the last several studies graphed against the months in which challenges occurred.
If your child has passed a food challenge, was it in the winter? If failed, was it in the spring or summer? Leave me a comment!
Seasonality and Food Allergies
September 23, 2013