Testing positive for an allergy to a common food such as wheat or nuts can turn your life upside down. No longer can you casually nosh; every meal must be scrutinized for dangerous ingredients. It’s time-consuming and stressful, and, worst of all, for many people diagnosed with allergies it may be unnecessary.

As the Wall Street Journal reports, recent studies have cast doubt on the reliability of traditional allergy tests. Food allergies are usually diagnosed through blood tests or skin-prick tests. Blood tests measure the level of IgE antibodies the body makes in response to a particular food; skin-prick tests look for a reaction when a small amount of a suspect substance is introduced into the skin. However, just because the body makes antibodies or reacts to a skin prick doesn’t necessarily mean the patient will have a reaction when he or she actually eats the food.

The only true way to test for allergies is to perform a “food challenge” under a doctor’s care, eating portions of a suspect food over set periods of time. Food challenges are expensive and lengthy, and they reveal the blood and skin-prick tests to be rather crappy: “In this month’s Journal of Allergy and Clinical Immunology, researchers in Manchester, England, reported that when 79 children who tested positive for peanut IgE antibodies were given food challenges, 66 of them could eat peanuts safely. At the American Academy of Allergy, Asthma and Immunology (AAAAI) conference last year, doctors from National Jewish reported that of 125 young patients given food challenges, more than half could tolerate foods they’d been told to avoid.”

There’s more: Johns Hopkins Children’s Center found in 2007 that blood tests were both under- and overestimating patients’ reactions to foods, and a 2003 report in Pediatrics found that positive results from blood allergy testing were associated with actual allergic reactions in fewer than half of the tested cases.

Image source: Flickr member sneakerdog under Creative Commons
