I'm asking this question from several perspectives.
I do believe that the organic movement is a hoax, for several reasons. First, organic food is a lot more expensive than non-organic, and it takes more energy and space to produce. If anyone is familiar with Thomas Malthus: a few hundred years ago he argued that the earth's population would eventually reach a point where there wouldn't be enough food to go around. If everyone around the world grew organic, the extra space and money required would unfortunately make that prediction come true, and a LOT of people would starve to death. Not to mention, without biotechnology, many organic crops can't withstand all types of weather and are more likely to perish...again leading to more possible hunger, especially in the areas of the world that need food most.
From a chowhound perspective, my question is: does organic actually taste better, and is it worth the extra money? Personally, my view on this varies. Occasionally at Whole Foods the "organic" section seems more visually appealing somehow. Does this mess with my brain and make me think it tastes better? On Penn and Teller's BS, they ran an experiment where they cut THE SAME BANANA in half and told people that one half was organic and the other was not. The majority of people interviewed strongly believed the "organic" half tasted better.
The same show had scientists claiming that there is no nutritional difference between organic and non-organic produce. Also, because chemical pesticides and fertilizers are off the table, organic farms often use "natural" fertilizer (i.e. manure), which can lead to E. coli contamination, among other health risks.
My questions to you:
Do you think there are real benefits to eating organic? Do you think the extra cost is worth it? How do you feel about the possible negative consequences?
Do you think that organic really does taste better? Do you feel you are eating healthier?
I'm curious what you think!